The 3% Edge: How Data Drives Success in Business and the Olympics

A recent Bloomberg BusinessWeek article entitled “The Tech Guy Building Wearables for America’s Olympians” profiles Mounir Zok, the man in charge of the U.S. Olympic Committee’s technology and innovation. The article discusses how Mr. Zok is bringing a Silicon Valley data scientist mentality to help America’s Olympic athletes more effectively leverage data and analytics to win Olympic medals.

To quote the article:

Zok won’t say who his partners were in the development process or even which athletes are using the suits; any hints might tip off Olympic engineers in other countries, erasing the USOC’s advantage. “I call it the 1 percent question,” he says. “Olympic events typically come down to a 1 percent advantage. So what’s the one question that, if we can provide an answer, will give our athletes that 1 percent edge?”

Wait a second, what is this “1% edge,” and is that something that we can apply to the business world? I wanted to drill into this “1% edge” to not only verify the number, but to further understand how the “1% edge” might apply to organizations trying to effectively leverage data and analytics to power their businesses (see “Demystifying the Big Data Business Model Maturity Index”).

Verifying the 1% Edge

To start validating this 1% edge, I evaluated single athlete sports, where focusing on the singular performer is easier than a team sport. Here’s what I found.

Professional Golf. The top 5 worldwide professional golfers (as measured by strokes per round) are only 3 percent better than the players ranked #96 – #100. Even more amazing: while only 3 percent in stroke average separates the two groups, the golfers ranked #96 – #100 earned 89.5 percent less than the top 5 (see Figure 1)!

Figure 1: YTD Statistics, Farmers Insurance Open, January 28, 2018

The 3 percent edge is quite evident in golf. Three strokes can be the difference between victory and defeat, and that small gap drives an enormous disparity in earning potential.

2016 Olympics Men’s Track. Next I looked at the 2016 Olympic men’s track events: the 100 meter dash, the 400 meter dash and the marathon. The difference between the very best and those dreaming of gold medals was again only a small percentage – fractions of a second in the sprinting events.

Figure 2: 2016 Olympic Men’s 100 Meter Results

Figure 3: 2016 Olympic Men’s 400 Meter Results

Figure 4: 2016 Olympic Men’s Marathon Results

In summary:

  • The difference between a gold medal and no medal was between 1.22% and 2.28%
  • The difference between a gold medal and 8th place IN THE OLYMPICS was between 2.40% and 3.67%

Think about the years of hard work and commitment these world-class athletes put into preparing for these events, only to finish out of the medals by approximately 2%. So while the “1% edge” may not be entirely accurate, I think a 1% to 3% difference on average looks about right for athletes (and organizations) that want to be considered world class.
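To make the arithmetic concrete, here is a minimal sketch that computes these gaps from the 2016 men’s 100 meter final times (times in seconds, as reported for that race; the resulting percentages line up with the summary above):

```python
# Percent edge between the gold-medal time and another finisher's time.
def edge(gold: float, other: float) -> float:
    return (other - gold) / gold * 100

# 2016 Rio men's 100m final, fastest to slowest (seconds).
times = [9.81, 9.89, 9.91, 9.93, 9.94, 9.96, 10.04, 10.06]

print(f"gold to 4th (no medal): {edge(times[0], times[3]):.2f}%")  # 1.22%
print(f"gold to 8th place:      {edge(times[0], times[7]):.2f}%")  # 2.55%
```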

Applying the 3% Edge to Become World Class

What does a 3 percent edge mean to your business? What does it mean to be 3 percent better in retaining customers, or bringing new products to market, or reducing hospital readmissions, or preventing unplanned maintenance?

While I couldn’t find readily available metrics defining world class in these business areas, I came back to the seminal research from Frederick F. Reichheld and W. Earl Sasser, Jr. highlighted in the classic 1990 Harvard Business Review article “Zero Defections: Quality Comes to Services.” The bottom line from their research: increasing customer retention rates by 5% increases profits by 25% to 95% (see Figure 5).

Figure 5: Profitability Impact from a 5% Increase in Customer Retention

When these research results were published in 1990, they startled so many marketing executives that they set off a rush to acquire Customer Relationship Management (CRM) applications from vendors like Siebel Systems.

The Power of Compounding 1% Improvements

One of the most powerful concepts behind “hitting lots of 3 percent singles versus a single 20 percent homerun” is the concept of compounding. So what does a “3 percent compounding” actually look like? Let’s walk through a fraud example.

Let’s say you have a $1 million run-rate business with an annual 10 percent fraud rate. That results in $100K in annual fraud losses. What if, through the use of advanced analytics, you were able to reduce the fraud rate by 3 percent each year? What is the cumulative effect of a 3 percent annual improvement over five and 10 years?

Figure 6: 3% Compounded Impact on Fraud

While the results start off pretty small, it doesn’t take much time until the compounding and cumulative effects of a 3 percent improvement provide a significant financial return. And though it may not make much sense to look beyond five years (due to customer turnover, technology, evolving competition and market changes), even at five years the financial return is significant.
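As a sanity check on the compounding math, here is a minimal sketch, assuming the 3 percent reduction is relative and compounds on the prior year’s fraud rate:

```python
def fraud_savings(revenue=1_000_000, fraud_rate=0.10, improvement=0.03, years=10):
    """Cumulative savings from a compounding 3% annual fraud-rate reduction."""
    baseline = revenue * fraud_rate             # $100K in annual losses today
    cumulative = 0.0
    for year in range(1, years + 1):
        fraud_rate *= 1 - improvement           # 3% better than the prior year
        cumulative += baseline - revenue * fraud_rate
        if year in (5, 10):
            print(f"Year {year}: cumulative savings ${cumulative:,.0f}")

fraud_savings()   # Year 5: ~$43K; Year 10: ~$151K vs. the $100K annual baseline
```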

Take it a step further and consider the impact when combining multiple use cases, such as:

  • Waste and spoilage reduction
  • Energy effectiveness
  • Preventative maintenance
  • Unplanned network or grid downtime
  • Hospital acquired infections
  • Unplanned Hospital readmissions
  • Power outages
  • Delayed deliveries

A business that achieves a 3 percent compounding effect across numerous use cases begins to look like a business capable of compound growth.

Summary

I believe there is tremendous growth opportunity for organizations that have the data and analytical disciplines to drill into what a 3 percent improvement in performance might mean to the overall health of their business. Such analysis would not only highlight the power of even small improvements, but offer clarity into what parts of the business should be prioritized for further acceleration.

Sources:

Figure 1: YTD Statistics, Farmers Insurance Open, January 28, 2018

Don’t Follow the Money; Follow the Customer!

“Mr. Schmarzo, we’ve noticed that your cholesterol count is at 210, so we have prescribed Fluvastatin and placed selected foods into your shopping basket to help you control your cholesterol. Complete the purchase by selecting ‘here’ and we’ll deliver the medication and groceries to your home today between 4:00 and 4:20pm.  If you complete the full Fluvastatin prescription, then we’ll reduce your monthly healthcare insurance payment by 5%.”

This scenario is surprisingly close to reality as mergers cause traditional healthcare industry borders (healthcare provider, healthcare payer, pharmacy) to crumble.  A recent BusinessWeek article “CVS Brings One-stop Shopping to Health Care” highlighted the potential benefits of a vertical consolidation of the healthcare ecosystem players:

The [CVS – Aetna] deal would create a behemoth that would try to shift some of Aetna customers’ care away from doctors and hospitals and into thousands of CVS stores. “Think of these stores as a hub of a new way of accessing health-care services across America,” says CVS Chief Executive Officer Larry Merlo. “We’re bringing health care to where people live and work.”

Healthcare value chain vertical integrations could provide substantial benefits to everyday consumers and patients alike, including:

  • An accelerated move to value-based care (focusing on preventive care) and away from the traditional “pay by the service” model (which rewards healthcare participants for more care)
  • A reduction in some of the dysfunctional incentives built into today’s healthcare value chain, such as pharmacy benefit managers (PBMs) profiting from back-end rebates and fees extracted from pharmacy companies

A superior understanding of customers’ behaviors, preferences and product usage patterns forms the basis for these industry transformations.

New Normal: Business Model Disintermediation and Disruption

Industry after industry is under attack by upstart disruptors, and no industry is safe. The basis for their attack is exploiting and monetizing superior knowledge of customer product preferences and buying habits. The more these disruptors know about their customers – their preferences, behaviors, tendencies, inclinations, interests, passions, associations, affiliations – the better positioned they are to create new sources of value and revenue (see Figure 1).

Figure 1: Business Model Disruption and Customer Disintermediation

Established companies are being attacked by companies that are more effective at leveraging big data technologies, new sources of customer, product and operational data, and advanced analytics (machine learning, deep learning, and artificial intelligence) to:

  • Disrupt business models by applying customer, product, operational and market insights to optimize key business and operational processes. Additionally, data-driven insights uncover new sources of revenue such as new products, services, markets, audiences, channels, partners, etc.
  • Disintermediate customer relationships by exploiting detailed customer engagement behaviors and product usage tendencies to provide a more compelling and differentiated user experience.

Check out “The New Normal: Big Data Business Model Disintermediation and Disruption” for more details on business model disruption and customer disintermediation.

The following companies are challenging traditional industry business models with superior customer preferences and buying habits:

  • Uber: The world’s largest taxi company owns 0 taxis
  • Airbnb: The largest accommodation provider does not own real estate
  • TripAdvisor: The world’s largest travel company owns 0 inventory
  • Skype, WhatsApp, WeChat: The largest phone companies do not own any telco infrastructure
  • SocietyOne: The fastest growing bank has no actual money
  • eBay: One of the world’s most valuable retailers has no inventory
  • Apple & Google: The largest software vendors write a minimal number of apps
  • Facebook: The most popular media owner does not create content
  • Netflix: The world’s largest movie house does not own any cinemas or create any content (until recently)

Industry transformations will only accelerate because leading companies realize that instead of “following the money,” they should “follow the customer.”

Follow the Customer

“Follow the money” is a catchphrase used to understand an organization’s flow of money and sources of value. Organizations use accounting, auditing, investigative, data and analytic skills to “follow the money” and determine their financial value.

However, this infatuation with following the money can actually lead organizations astray and make them vulnerable to disruption and disintermediation by more nimble, more customer-focused competitors. Such disruption is most likely in industries where:

  • The market is too fragmented for any one organization to provide a complete customer solution and experience.
  • Customer experiences are unsatisfactory.
  • Customer outcomes are questionable, or are downright wrong.
  • A “Product Mentality” permeates the Senior Executive team.

For example, Amazon is vertically integrating the grocery industry with their recent acquisition of Whole Foods. Where Amazon plans to take the grocery industry (as well as the entire retail industry) starts with their mission statement:

  • Traditional Grocer: “Our goal is to be the first choice for those customers who have the opportunity to shop locally”
  • Amazon: “To be Earth’s most customer-centric company, where customers can find and discover anything they might want to buy online, and get those items quickly, at the lowest possible prices”

Amazon enhances and simplifies the customer-centric experience with a host of simple, easily accessible user experience choices such as one-click buying, mobile ordering, free and same day delivery, and more.

Check out “What is Digital Transformation?” for examples of how Amazon is leveraging customer insights to vertically integrate the grocery industry.

Optimizing the Customer Experience

80% of customers want a personalized experience from their retailer. Customers don’t want to be treated as numbers on a fact sheet and love it when organizations show a semblance of personalization towards them[1].

Providing a more holistic, more engaging customer experience starts with understanding each individual customer’s behaviors, tendencies, inclinations, biases, preferences, patterns, interests, passions, associations and affiliations. More than just capturing the customer’s purchase and social media data, leading customer-centric organizations uncover and codify 1) what products and services a customer tends to buy and 2) what products and services customers like them buy.

Amazon is arguably the industry leader in providing a highly personalized customer experience that starts with their recommendation engine (see Figure 2).

Figure 2: Amazon Recommendation Engine

Amazon recently open-sourced their artificial intelligence framework (DSSTNE: Deep Scalable Sparse Tensor Neural Engine) that powers their recommendation engine.  Amazon’s product catalog is huge, making their purchase transactions datasets extremely sparse.  This creates a significant challenge for traditional neural network frameworks, so Amazon created DSSTNE to generate recommendations that power personalized experiences across the Amazon website and Amazon devices[2].
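To see why sparsity matters, consider a toy purchase matrix (a sketch using SciPy’s sparse format, not Amazon’s internals): with a catalog of millions of products, each customer touches only a tiny fraction of them, so almost every cell is zero.

```python
import numpy as np
from scipy.sparse import csr_matrix

# Toy purchase matrix: rows = customers, columns = products.
rows = np.array([0, 0, 1, 2, 2, 2])   # customer indices
cols = np.array([1, 4, 2, 0, 3, 4])   # product indices
vals = np.ones(len(rows))             # 1 = purchased

purchases = csr_matrix((vals, (rows, cols)), shape=(3, 5))
density = purchases.nnz / (purchases.shape[0] * purchases.shape[1])
print(f"density = {density:.0%}")     # 40% in this toy; real catalogs are << 1%
```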

Dell EMC Consulting uses Analytic Profiles to capture and codify a customer’s behaviors, tendencies, inclinations, biases, preferences, patterns, interests, passions, associations and affiliations (see Figure 3).

Figure 3: Customer Analytic Profile

See “Analytic Profiles: Key to Data Monetization” for more details on Analytic Profiles.
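As a rough illustration (the field names here are hypothetical, not Dell EMC’s actual schema), an Analytic Profile can be thought of as a per-customer record of metrics, scores and propensities that analytic models keep up to date:

```python
from dataclasses import dataclass, field

@dataclass
class CustomerAnalyticProfile:
    """Illustrative Analytic Profile: per-customer insights, model-maintained."""
    customer_id: str
    behaviors: dict = field(default_factory=dict)           # e.g. visit frequency
    propensity_scores: dict = field(default_factory=dict)   # e.g. churn risk
    interests: list = field(default_factory=list)

profile = CustomerAnalyticProfile("C1001")
profile.behaviors["weekly_visits"] = 3
profile.propensity_scores["churn_risk"] = 0.18   # refreshed by ML models
profile.interests.append("golf")
```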

Organizations can leverage customer insights captured in the Analytic Profiles to optimize key business and operational processes, reduce security and compliance risks, uncover new revenue opportunities, and create a more compelling customer engagement lifecycle (see Figure 4).

Figure 4: Optimizing the Customer Lifecycle

See “Optimizing the Customer Lifecycle With Customer Insights” for more details on leveraging big data and data analytics to optimize your customer’s lifecycle.

Follow the Customer Summary

Leading organizations are realizing that instead of “following the money,” they should be “following their customers” and mining their customers’ buying habits regardless of artificially defined industry boundaries (see Figure 5).

It is these customer insights that will transform the organization’s business models, disintermediate under-served customers, create new sources of revenue, and eventually transform the business into an intelligent enterprise.

Sources:

[1] Retail: How to Keep it Personal & Take Care of Privacy

[2] Generating Recommendations at Amazon Scale with Apache Spark and Amazon DSSTNE

Why Hardware Deployment Is the Unsung Hero in Your IT Transformation

What if you could achieve production readiness 66% faster?*

One of the most rewarding parts of my role at Dell EMC is meeting customers and learning how we can best help them overcome their challenges to make their IT transformation real. Customer input is a big part of what informs our transformative journey: to develop ever-better, ever-simpler services—with a great end-to-end customer experience.

Another Milestone

Today we reached another milestone in our services journey with the unification of deployment services across the entire Dell EMC hardware portfolio.

It’s a journey that began with the unification of support services in the ProSupport Suite and continues with the unification of our ProDeploy Suite of services for deploying Dell EMC hardware—servers, storage, data protection and networking infrastructure—across the data center—and across data centers worldwide.

Don’t Risk Doing It Yourself

It’s a simple fact: every new product must be deployed by someone.

As the pace of technology changes and pressure to “do more with less” grows, many IT organizations are revisiting the cost/reward calculations of in-house deployment.

By standardizing and unifying deployment services with ProDeploy Suite, we can leverage—and pass along—consistency and economy-of-scale advantages that reduce time, cost and complexity.

We invest in new service offer development hand-in-hand with the design of new infrastructure solutions. We continually refine our deployment skills, tools and best practices to put the latest innovations fully to work, quickly and smoothly.

How Do You Define “Ready”?

Our global service delivery experience has shown us that different organizations and projects require different levels of deployment to be “ready” on day one. Ready can mean anything from racking equipment to installing a baseline configuration all the way to implementing a fully enabled solution.

That’s why ProDeploy Suite services are designed to deliver three different levels of functional, operational and production readiness—with services from basic hardware installation, to planning and configuration, to software installation, and knowledge transfer and training.

Realizing a Complete Service Experience

Our suites, ProSupport and ProDeploy, are easily combined with other Dell EMC services, such as Residency and Intelligent Data Mobility. Together they provide the advantages of simplicity and consistency—with the additional expertise, coordination and support needed to achieve specific objectives whether in a local facility or on a global scale.

Our Journey Together Doesn’t End Here

The ProDeploy Suite is the next step—but not the last—in our journey to offer easy-to-understand and easy-to-consume services—flexible enough to meet your needs, but without the expense and delay often associated with custom services.

As we continue on our journey, our objective is to build reliability and agility into our services that accelerate customer IT transformation—from the edge, to the data center, to the cloud, to the multi-cloud.

Our success depends on matching customers to the right deployment service to deliver on the potential of our Dell EMC infrastructure solutions—across the IT lifecycle.

So let us hear from you by commenting below!

*Based on “Bring new systems to production readiness faster and with less effort from in-house administrators,” a Principled Technologies Test Report commissioned by Dell, February 2017. Full report found here: http://facts.pt/YU95pg
**Based on a July 2017 internal analysis of support data from February 2016 through June 2017 for Dell PowerEdge, Dell Networking, and Dell SCv/PS/PowerVault Storage devices. Actual results may vary.

InFocus Year in Review: Top Trending Content of 2017

Digital transformation, the increasingly present catalyst for universal change, raced ahead in 2017, leaving no industry immune.

Fortunately, InFocus thought leadership stayed on top of technology’s rapid growth, leveraging our real-world experiences across the six customer-centric categories in which our bloggers write: Big Data, Cloud, Technology, Service Excellence, Learning and Data Protection.

In each of these disciplines, bloggers discussed how organizations need to adapt to and transcend disruptive technology shifts – namely Big Data, Cloud and IOT – in every facet of how their business is conducted: from harnessing new skills, to interacting with their own customers and external markets in opportune and growth-promising ways, to changing the very foundation of the enterprise’s infrastructure.

Let’s take a look at this year’s most socially shared content per category by you, the InFocus reader.


Big Data


Difference between Big Data and Internet of Things by Bill Schmarzo

Bill discusses the differences between Big Data and IOT, the ‘IOT analytics challenge’ and IOT’s 3-tier architecture. Is your organization lacking an IOT analytics strategy? Bill provides a framework and step-by-step guidelines to successfully launch your IOT journey.

 

Is Data Science Really Science? by Bill Schmarzo

Bill discusses predictive indicators and behavioral insights harvested from data and analytics that enable an organization to make and adjust decisions. The argument? Why react to customer trends, likes and dislikes in real-time when you can anticipate the patterns and proactively realign your strategy?

 

5 Steps to Building a Big Data Business Strategy by Bill Schmarzo

Bill’s mantra ‘used well, Big Data changes the basis of competition in industry after industry’ incites organizations to build a business strategy that incorporates Big Data using a proven 5-Step approach.

See Bill’s Top 2017 Big Data, Data Science and IOT Blogs here.


Cloud


ROI of Private Cloud: Key Measurements and Metrics by Norman Dee

Norman discusses the 4 key concepts in projecting ROI – costs, savings, NPV and payback period – and, by way of illustration, provides detailed analyses of 3 organizations looking to move to private cloud.

 

NFV Operating Models: How to Mix Oil and Water (also known as IT Operations and Network Operations) by Laddie Suk

Laddie talks about Network Function Virtualization (NFV) and Virtualized Network Functions (VNFs), which upend former ownership and role responsibilities. Can oil and water mix? Laddie advises designing a blended organization to operate NFV and provides step-by-step details on its planning and successful execution.

 

5 Lessons Learned from Enterprise-scale IT Transformations by Barb Robidoux

Can you benefit from the ‘lessons learned’ in on-the-ground execution of enterprise-scale IT transformations? Barb discusses why business-enabling IT transformation happens only when the gap between strategy and implementation is bridged, and weaves in Tom Roloff’s Top 5 Takeaways on IT Transformation for foolproof planning and execution.


Technology


Disrupt, or Else: Top 5 Digital Transformation Considerations by Alan Walsh

Alan discusses the vital importance of digitizing businesses from the top down and the key themes for the digitization journey, including effective strategy and execution, agility, remaining customer-focused and embracing cultural change.

 

3 Key Elements to Database as a Service (DBaaS) by Haroon Qureshi

How does DBaaS fit into my current environment and organization? Haroon not only discusses the unprecedented level of functional and business benefits in transforming to DBaaS, but also its impact on people, processes and technology.

 

How My Navy Career Informs My Work at Dell EMC by Josh Klein

As a client solutions leader, Josh discusses the process of identifying an organization’s appropriate IT transformation strategy and the importance of communications with and between stakeholders to ensure the vision will become a reality.


Service Excellence


Finally, PC Imaging that Doesn’t Require Time Travel by John Moody

Does one have to embark on a 120-year space travel journey to come by the perfect imaging tool and processes? Heck, no, says John, who presents Dell EMC’s ProDeploy Plus and ImageAssist as the solution for an organization’s infinite imaging demands.

 

4 Questions to Ask Before Starting a Data Migration by Jim Donovan

A successful migration relies on people and partners who are proven industry experts, well-versed in moving data. So how can you distinguish the amateurs from the pros? Jim provides four considerations to help you make the decision.

 

Say Goodbye to Labor Intensive, Hello to Light-Touch Deployment for PCs by John Moody

“To sit back and let fate play its hand out and never influence it is not the way man was meant to operate.” It’s a quote from the late U.S. Senator John Glenn through which Moody conveys his enthusiasm for ProDeploy Client Suite, a product offering that enables Dell EMC customers’ administrators and service partners to deploy PCs with greater speed and efficiency than ever before.


Learning


Step into a Learning Zone to Realize Your Full Potential by Kei Tsuda

How does the Learning Zone differ from the Performance Zone? Kei distinguishes between the two and challenges us to proactively pursue training – through online learning platforms, company events and seminars – to realize our full potential, whether or not mandated by our work environment.

 

What does it mean to be an Entrepreneur? by Bill Schmarzo

What are the core set of characteristics that differentiate entrepreneurs from others? Bill tells us one of those characteristics is ‘impassioned entrepreneurial thinking,’ a quality that leads an individual to challenge commonly held truths. “When you approach things from the entrepreneurial angle, life becomes very exciting and fun,” says Bill.

 

Earn Your Dell EMC VMware Co-Skilled Digital Badge by Kei Tsuda

By what means can you expand your knowledge and skillset, and increase your value to your organization as a trusted advisor? The Dell EMC VMware Co-Skilled Digital Badge, that’s what. Kei suggests earning the no-cost badge to differentiate yourself from others as a recognized expert in innovative hyper-converged infrastructure solutions from Dell EMC and VMware.


Data Protection


Can the Internet Ever Be Secure? by David Edborg

David discusses the substantial and escalating cost of cybercrime, why cybercrime is the perfect ‘ghost crime’ and how organizations can implement Blockchain technologies to not only protect the information we don’t want accessed, but also identify the assailant.

 

Transforming Security for Our Next Generation Systems by David Edborg

Can an organization overcome feeling vulnerable to cyberattack by composing a security architecture with 190 products? 290? David discusses the weaknesses in the five high-level functions of the Cybersecurity Framework (CSF) and, in turn, how Dell Technologies delivers security essentials that provide the real-time threat detection, business context and management needed to keep valuable assets safe and the business innovating.


Summary: ‘Disrupt or Be Disrupted’


In 2017, our record-breaking number of readers gravitated towards content and thought leadership on IT and Digital Transformation that’s shaping our present and imminent future—advances and knowledge about Big Data, IOT, Cloud, Cybersecurity, and how to build strategy and business models leveraging the superhighway of Data and Analytics.

Although none of our blogs on Virtual Reality and Artificial Intelligence were featured in this 2017 recap, VR/AR and AI were prominently discussed topics. They will feature more prominently on InFocus in 2018.

Please remain engaged with InFocus as our bloggers lead us into this New Year with continued compelling discussion across our six customer-centric categories, impart advice on how to ‘stay digitally fit and relevant,’ and offer expert counsel on the next generation of IT breakthroughs.

We are excited for what’s to come and thank you.


Also a special thanks to the backstage crew who helped drive a successful 2017.

Ant Gozkaya

Social media lead, analytics and optimization

 

Lisa Mae DeMasi

Content management and social contributor

 

Rachel Mclean

Content, social, operations, and special projects.

 

Bridging the Operational Gap to Accelerate NFV Deployment

New case study-based white paper offers valuable insights for CSPs on accelerating NFV deployment and achieving operational excellence.

Welcome to the first of a 4-part blog series on NFV deployments. The blogs will provide additional perspective on our newly released white paper, which brings the operational challenges of NFV deployments to life through three real-world examples – detailing both where they went awry and how applying a new agile methodology would have mitigated the problems. I encourage you to check it out.

In 2016, AT&T Labs predicted that adopting Network Functions Virtualization (NFV) and Software Defined Networking (SDN) would save the company a whopping 40-50% in OPEX in coming years.

One year into its 3-year journey to virtualize 75% of its network by 2020, the company reports significant cost savings – and that’s not all. Some 1,700 businesses are using new AT&T FlexWare NFV/SDN devices and services to set up multiple Virtual Network Functions (VNFs), such as bandwidth management, virtual routers, firewalls, and other security applications on AT&T’s managed network.

Not Just a Business Opportunity—Business Survival

Like AT&T, mobile and fixed-line Communications Service Providers (CSPs) are investing in NFV/SDN to reduce CAPEX and OPEX and enable new kinds of revenue-producing services. In fact, SNS Research estimates that CSP NFV/SDN investments will rise to nearly $22 Billion by the end of 2020.

But the business case for NFV/SDN goes beyond top- and bottom-line improvements. CSP business survival depends on quick migration to new business and cost models capable of delivering new kinds of added-value, on-demand services at web scale. In short, CSPs must emulate cloud service provider models to be able to compete against them and other emerging market entrants.

Why NFV/SDN?

With NFV/SDN, CSPs can leverage agility, dynamic scaling, and efficiency advantages similar to those that have been realized in IT through infrastructure virtualization and software-defined technologies.

Operators gain the flexibility to efficiently shift network capacity to where it is most needed, in order to deliver the new kinds of services that customers are demanding. It enables CSPs to keep pace with rapidly evolving technologies, by allowing changes to underlying infrastructure without impacting customer experience.

Arrested Deployment

Despite the compelling business case and investments in NFV/SDN to-date, CSPs have been slow to reap expected benefits. Often the trouble is in moving from proof-of-concept and lab evaluations into field trials and commercial deployments.

Research outlined in a new white paper on accelerating NFV deployment finds that a major obstacle to the successful deployment of dynamic, software-controlled network functions has been operational – specifically, legacy domain- and node-based silos of IT and network management that lack the service-oriented processes, skillsets and organizational structures and metrics to successfully deploy and operate NFV/SDN.

Bridging the IT/Network Operations Gap

Having personally, in my past professional life, led both Network Operations (responsible for core network functions) and IT Operations (responsible for the IT systems that network organizations depend on), I can attest that:

  • Successful NFV/SDN service operations require both IT and network expertise, applied in new ways
  • NFV/SDN operations challenge both traditional IT operations and traditional network operations
  • The gap between IT and network operations is both practical and cultural – each is accustomed to dealing with different kinds of workloads/applications, resiliency mechanisms, regulatory requirements and ecosystems (e.g., OSS/BSS vs. ERP); and each perceives that the other “goes about things in the wrong way”

Read All About It

To delve deeper into these issues – and learn about agile approaches to the operating model transformation critical to NFV/SDN success – I invite you to download our new Dell EMC white paper, Bridging the IT/Network Operations Gap to Accelerate NFV Deployment and Achieve Operational Excellence.

Based on research with MST Consulting and Dell EMC experience with more than 2,000 successful cloud implementations, the paper describes methodologies that can mitigate CSP challenges and includes NFV case studies that can help CSPs avoid missteps and gain a new perspective on the right next step for their company.

Next time: Failure to Migrate: A Case Study in NFV Deployment


Join Us At Mobile World Congress in Barcelona Feb 26 – March 1

Visit us in Hall 3, Booth 3K10 for a more in-depth discussion of NFV/SDN and other topics important to you. Or contact Larry Rouse (larry.rouse@dell.com) to set up a meeting ahead of time.


Today’s Travel Tip

For those headed to Barcelona, here are my top 4 must-see sights:

  • La Sagrada Familia – Gaudi’s spectacular basilica that has been under construction since 1882
  • The Gothic Quarter with the Barcelona Cathedral
  • Picasso Museum – also in the Gothic Quarter
  • Las Ramblas – for shopping, restaurants or just people watching

Measuring What Matters: An Enterprise DevOps Perspective on IT Performance

Performance and productivity metrics are important because they give us the needed information to shape or reshape behavior. They provide the feedback and insight to continually improve processes and practices toward our common goal, namely creating value for the customer.

Figure 1: The DevOps Scorecard

Unfortunately, in many large IT shops, performance metrics are not aligned toward this common goal and instead reflect the objectives and scorecards of single departments or practices. For example, does a customer care about uptime? No. Customers/users expect applications and the systems they run on to be up. They don’t care if it has been running for 1 day or 364 days without interruption. Yet uptime (aka the five 9’s) is often a key success metric for System Administrators and IT shops at-large.

“Nice work keeping the servers running,” said no CEO ever.

Too often, we focus on IT-centric measures, like uptime, rather than on customer or user success measures to evaluate our performance. I am not suggesting uptime is unimportant; rather, if you approach this metric from the customer point-of-view, you will quickly see that the measure of uptime is not really valuable to the customer, nor does it provide any real insight into the organization’s performance.

To keep it simple, think of your car or truck. When you bring it to a garage would you want the mechanic or service manager to tell you how many days, hours, and minutes you drove it without incident before bringing it into the shop? No. You don’t care. Would that data be valuable to the car dealership or manufacturer? Does it provide actionable data? I would argue no.

But in IT, we treat uptime as a critical measure. We bonus people on maintaining uptime levels. We spend time and money capturing, collecting, transforming, and reporting on that data. Yet it isn’t adding value to either our customer or our own performance.

Borrowing a page from DevOps, we know that flow-based, event-driven metrics are critical to measuring and reporting on the performance of teams developing and deploying applications and infrastructures. Flow-based, event-driven metrics help IT teams answer critical performance questions from the customer perspective. They provide feedback on value creation processes and practices like:

  • How quickly can a request be fulfilled?
  • How quickly can a new or updated capability, function, or service be delivered?
  • How quickly can the system recover from an issue or outage?
  • How likely is an update from you to fail for the customer?

These customer-centric questions easily translate into performance measures such as success rate, change cycle time, mean-time-to-recover (MTTR), and release frequency. Additionally, all four of these metrics are directly actionable.
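Here is a minimal sketch of how those four measures might be derived from deployment event records (the record fields are illustrative, not a specific tool’s schema):

```python
from datetime import datetime
from statistics import mean

# Illustrative deployment events: change start/finish, outcome, recovery time.
deploys = [
    {"start": datetime(2018, 1, 1), "end": datetime(2018, 1, 2),  "ok": True,  "recover_h": 0},
    {"start": datetime(2018, 1, 5), "end": datetime(2018, 1, 7),  "ok": False, "recover_h": 4},
    {"start": datetime(2018, 1, 9), "end": datetime(2018, 1, 10), "ok": True,  "recover_h": 0},
]

success_rate = sum(d["ok"] for d in deploys) / len(deploys)
cycle_time   = mean((d["end"] - d["start"]).days for d in deploys)    # change cycle time
mttr         = mean(d["recover_h"] for d in deploys if not d["ok"])   # mean-time-to-recover
release_freq = len(deploys) / (deploys[-1]["end"] - deploys[0]["start"]).days

print(f"success rate {success_rate:.0%}, cycle time {cycle_time:.1f} days, "
      f"MTTR {mttr:.1f} h, {release_freq:.2f} releases/day")
```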

Figure 2: Baseline your performance against industry

For example, if your uptime metric is missed, you then need to generate a new report highlighting downtime. More specifically, you need to inspect downtime to understand why it happened (success rate); why it took so long to recover (MTTR); and why it took so long to repair (change cycle time).

It is these event-based metrics that provide the insights and data to improve performance. If success rate is low, you can evaluate your quality, verification, and validation processes to understand how and why issues are missed before hitting Production. If your recovery time is too long, you can evaluate the processes of deployment, rollback, and failover to improve adherence to known-good packages and standards. If cycle time is too long, you can evaluate your change management and development processes to accelerate responsiveness and agility.

Furthermore, if you do measure these flow-based, event-driven metrics you will be indirectly managing uptime. If your success rate is high and your recovery time is low, then by default your uptime is high.

And don’t forget the customer: these flow-based, event-driven metrics also correlate to customer satisfaction and value. If IT is responsive to customer requests, timely in its delivery, confident in quality, and quick to recover when an issue does occur, then customers will be more satisfied with the services provided. This is corroborated by the annual State of DevOps Report, which regularly finds that high performing IT teams are twice as likely to over-achieve on enterprise profitability, market share, and productivity goals.

So, where does this data come from?

Flow-based, event-driven performance metrics are derived from data generated by continuous delivery (CD) pipelines. Event and application data combined with logs from various tools along the workflow capture key measures that reflect real-time pipeline performance.

For example, an automated test tool, like unittest, will generate a success flag, an audit trail, and an elapsed time count. This data, combined with data from other applications across the tool chain, is aggregated per change or release candidate. Together, this data illustrates the success rate and cycle time of the end-to-end process for a single change or release candidate. Change/release data can be further combined to illustrate trends at the application, program, portfolio, or enterprise level.
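For instance, a pipeline stage might run a unittest suite programmatically and emit an event record carrying the success flag and elapsed time (a sketch; the event schema is illustrative):

```python
import time
import unittest

class CheckoutTests(unittest.TestCase):
    def test_total(self):
        self.assertEqual(2 + 2, 4)

start = time.time()
suite = unittest.defaultTestLoader.loadTestsFromTestCase(CheckoutTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)

event = {                       # one record in the pipeline's metrics stream
    "stage": "unit-test",
    "success": result.wasSuccessful(),
    "tests_run": result.testsRun,
    "elapsed_s": round(time.time() - start, 3),
}
print(event)
```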

Granular, real-time data surfaced to teams provides them with the needed information to act. This data can inform a team early that there is a problem with a given release, or that their process is too slow. Furthermore, it points directly to the constraint area or issue allowing the team to quickly swarm the problem.

Employing this proactive measurement model requires a shift in how we design, build, and report metrics from both a technological and business perspective. It requires a clear understanding of desired enterprise outcomes, collaboration across IT and business functions, and modern architectures to be successful. For a deep dive on how to build proactive monitoring systems, I recommend the book The Art of Monitoring by James Turnbull.

Summary

Our experience in helping customers transform IT using DevOps principles and practices has hardened our resolve about the importance of flow-based, event-driven performance metrics. Without these metrics it is impossible to prove success with DevOps, and even more so impossible to understand where best to act next. Metrics are the language of the executive.

If we want to transform the enterprise we need to use a language they understand. IMO, flow-based, event-driven performance metrics are the key.

Recently, I have been working with the thought-leaders at DevOps Research and Assessment (DORA). They have incorporated these customer-focused performance metrics into an online assessment tool, also called DORA. The tool not only maps these metrics but also identifies 20 known IT capabilities proven to drive enterprise performance. Check out the assessment, and please note that measurement and monitoring are considered a key competency there too.

Watch DORA CEO and Chief Scientist Dr. Nicole Forsgren and CTO Jez Humble’s research program presentation, “What We Learned from Three Years of Sciencing the Crap out of DevOps.”

Sources:

Figure 1: The DevOps Scorecard

Figure 2: DevOps Research & Assessment

How State Governments Can Protect and Win with Big Data, AI and Privacy

I was recently asked to conduct a 2-hour workshop for the State of California Senior Legislators on the topic of “Big Data, Artificial Intelligence and Privacy.” Honored by the privilege of offering my perspective on these critical topics, I shared with my home-state legislators the once-in-a-generation opportunities awaiting the great State of California (“the State”), where decision makers could vastly improve their constituents’ quality of life while creating new sources of value and economic growth.

Industrial Revolution Learnings

We have historical experiences and references to revisit in discerning what government can do to nurture our “Analytics Revolution.” Notably, the Industrial Revolution holds many lessons regarding the consequences of late and/or confusing government involvement and guidance (see Figure 1).

Figure 1: Lessons from the Industrial Revolution

Government’s role in the “Analytics Revolution” is clear: to carefully nurture and support industry, university, and government collaboration to encourage sustainable growth and prepare for massive changes and opportunities. The government can’t afford to stand by and let the markets decide. By the time the markets have decided, it may be too late to redirect and guide resources, especially given the interests of Russia and China in this all-important science.

Be Prepared to Act on the Nefarious

Access to sensitive information, data protection, privacy – these are all hot button issues with the citizenry. The State must be aware of the societal and cultural risks associated with the idea of a “Big Brother” shadowing its people. The State must champion legislation in cooperation with industry in order to protect the masses, while not stifling creativity and innovation. That’s a tough job, but the natural conflict between “nurturing while protecting” is why the government needs to be involved early. Through early engagement, the State can reduce the tension between industrial growth and personal privacy.

The “Analytics Revolution” holds tremendous promise for the future of industry and personal achievement, but will require well-defined rules of conduct and engagement. Unsupervised growth or use may lead to information being exploited in nefarious ways with potentially damaging results.

The State must protect its constituents’ sensitive information while nurturing the industrial opportunity. That’s a tall order, but nothing less should be expected from our government, industry and society leaders.

Can’t Operate in a World of Fear

We can’t be afraid of what we don’t know. The State must increase constituents’ awareness and education of Big Data and Artificial Intelligence: what they are, what they are used for, and the opportunities locked within, including “The Good, the Bad, and the Ugly.”

We can’t operate in a world of fear, jumping to conclusions based upon little or no information – or worse yet, misinformation or purposeful lies. Government leaders must collaborate with industry and universities to actively gain understanding of the true ramifications and capabilities of Big Data and Artificial Intelligence before they create legislation (see Figure 2).

Figure 2: Government Leaders Must Seek Information before Jumping to Conclusions

It’s because I’m an educator in this field that I was so honored to be part of this discussion. In addition to discussing the economic opportunities that lie within Big Data and Artificial Intelligence, I wanted to help our legislators understand they should prioritize their own learning and education of these sciences before enacting rules and regulations.

Predict to Prevent

The opportunities for good are almost overwhelming at the government level! Whether in education, public services, traffic, fraud, crime, wild fires, public safety or population health, Big Data and Artificial Intelligence can dramatically improve outcomes while reducing costs and risks (see Figure 3).

Figure 3: Big Data and AI Reducing Crop Loss to Diseases

However, to take advantage of the potential of Big Data and Artificial Intelligence, The State, its agencies, and its legislators need to undergo a mind shift. They need to evolve beyond “using data and analytics to monitor agency outcomes” to understanding how to “leverage data and analytics to Predict, to Prescribe and to Prevent!”  That is, these organizations need to evolve from a mindset of reporting what happened to a mindset of predicting what’s likely to happen and prescribing corrective or preventative actions or behaviors (see Figure 4).

Figure 4: The “Predict to Prescribe to Prevent” Value Chain

There are numerous use cases of this “predict to prevent” value chain that will not only benefit state agencies’ operations, but also have positive quality-of-life ramifications for the residents of California, including the opportunity to prevent:

  • Hospital acquired infections
  • Crime
  • Traffic Jams / vehicle accidents
  • Major road maintenance
  • Cyber attacks
  • Wild fires
  • Equipment maintenance and failures
  • Electricity and utility outages
  • And more…

Role of Government

The role of government is to nurture, not necessarily to create, especially in California. California is blessed with a bounty of human capital resources including an outstanding higher education system and an active culture of corporate investing such as the Google $1B AI Fund (see “Google Commits $1 Billion In Grants To Train U.S. Workers For High-Tech Jobs”).

There is a bounty of free and low-cost Big Data and Artificial Intelligence training available. For example, Andrew Ng, one of the world’s best-known artificial-intelligence experts, is launching an online effort to create millions more AI experts across a range of industries. Ng, an early pioneer in online learning, hopes his new deep-learning course on Coursera will train people to use the most powerful idea to have emerged in AI in recent years.

California sits in rarified air when it comes to the volume of natural talent in the Big Data and Artificial Intelligence spaces. The State should seize on these assets, coordinate all of these valuable resources and ensure that this quality and depth of training is available to all.

State of California Summary

In summarizing what I told my audience, Big Data and Artificial Intelligence provide new challenges, but the opportunities for both private and public sectors are many. To harness the power of Big Data and AI, the State should focus on:

  • Minimizing the impact of nefarious, illegal and dangerous activities
  • Balancing consumer value vs. consumer exploitation
  • Addressing inequities in data monetization opportunities
  • Re-tooling / re-skilling the California workforce
  • Fueling innovation via university-government-business collaboration
  • Adopting regulations to ensure citizen/customer fairness (share of the wealth)
  • Providing incentives to accelerate state-wide transformation and adoption

Figure 5: Threats to the California “Way of Life”

It is up to everyone — the universities, companies, and individuals — to step up and provide guidance to our government and education leaders to keep California at the forefront of our “Analytics Revolution.” This is one race where there is no silver medal for finishing second.

The Consumerization of Artificial Intelligence

Consumerization is the design, marketing, and selling of products and services targeting the individual end consumer.

Apple CEO Tim Cook recently promoted a $100-per-year iPhone app called Derm Expert. Derm Expert allows doctors to diagnose skin problems using only their iPhone.  Doctors take a photo of a patient’s skin condition and then Derm Expert diagnoses the problem and prescribes treatment. Doctors can effectively treat patients without a high performance computer or an expensive technology environment. They just need the same iPhone that you and I use every day.

Figure 1: Derm Expert App

Derm Expert makes use of Apple’s Core ML framework that is built into all new iPhones. Core ML makes it possible to run Machine Learning and Deep Learning algorithms on an iPhone without having to upload the photos to the “cloud” for processing.

Apple is not the only company integrating Machine Learning and Deep Learning frameworks into their products, but it may be the first company to put such a powerful capability into the hands of millions of consumers. Whether we know it or not, we have all become “Citizens of Data Science,” and the world will never be the same.

Embedding Machine Learning Frameworks

Apple Core ML in the iPhone is an example of how industry leaders are seamlessly embedding powerful machine learning, deep learning, and artificial intelligence frameworks into their development and operating platforms. Doing so enables Apple iOS developers to create a more engaging, easy-to-use customer experience, leveraging Natural Language Processing (NLP) for voice-to-text translation (Siri) and facial recognition. Plus, it opens the door for countless new apps and use cases that can exploit the power of these embedded frameworks.

Core ML enables developers to integrate a broad variety of machine learning algorithms into their apps with just a few lines of code. Core ML supports over 30 deep learning (neural network) algorithms, as well as Support Vector Machine (SVM) and Generalized Linear Models (GLM)[1].

For example,

  • Developers can integrate computer vision machine learning features into their app including face tracking, face detection, landmarks, text detection, rectangle detection, barcode detection, object tracking and image registration.
  • The natural language processing APIs in Core ML use machine learning to decipher text using language identification, tokenization, lemmatization and named entity recognition.
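To give a sense of what integrating a model "with just a few lines of code" looks like, here is a minimal sketch using Apple's coremltools Python package to convert a scikit-learn classifier into the Core ML format. The data, model choice, and file name are hypothetical stand-ins for illustration; this is not the actual Derm Expert pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
import coremltools

# Hypothetical stand-ins for image-derived features and diagnosis labels.
X_train = np.random.rand(200, 4)
y_train = np.random.randint(0, 2, 200)

# Train an ordinary scikit-learn model...
model = LogisticRegression().fit(X_train, y_train)

# ...then convert it to the Core ML format. The resulting .mlmodel file can
# be dropped into an Xcode project and evaluated entirely on-device --
# no round trip to the cloud.
coreml_model = coremltools.converters.sklearn.convert(model)
coreml_model.save("SkinClassifier.mlmodel")
```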

Core ML supports Vision for image analysis, Foundation for natural language processing, and GameplayKit for evaluating learned decision trees (see Figure 2).

Figure 2: Core ML Is Optimized for On-Device Performance, Which Minimizes Memory Footprint and Power Consumption

 

Machine Learning and Deep Learning Microprocessor Specialization

Artificial intelligence, machine learning and deep learning (AI | ML | DL) require massive amounts of computer processing power. And while the current solution is simply to throw more processors at the problem, that solution eventually won't scale as processing needs and the volume of detailed, real-time data increase[3].

One of the developments leading to the consumerization of artificial intelligence is the ability to exploit microprocessor or hardware specialization. The traditional Central Processing Unit (CPU) is being replaced by special-purpose microprocessors built to execute complex machine learning and deep learning algorithms.

This includes:

  • Graphics Processing Unit (GPU): a specialized electronic circuit designed to render 2D and 3D graphics together with a CPU; gamers know it as the graphics card. GPUs are now being harnessed more broadly to accelerate computational workloads in areas such as financial modeling, cutting-edge scientific research, deep learning, analytics, and oil and gas exploration.
  • Tensor Processing Unit (TPU): a custom-built integrated circuit developed specifically for machine learning and tailored for TensorFlow (Google's open-source machine learning framework). The TPU is designed to handle common machine learning and neural network calculations for training and inference, specifically matrix multiply, dot product, and quantization transforms (a toy sketch of quantization follows below). On production AI workloads that utilize neural network inference, the TPU is 15 to 30 times faster than contemporary GPUs and CPUs, according to Google.
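To make "quantization transforms" a bit more concrete, here is a toy NumPy sketch of the idea: map float32 weights onto 8-bit integers, run the matrix multiply, and rescale. This illustrates the concept only; it is not how a TPU is actually programmed.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4)).astype(np.float32)   # float32 weight matrix
x = rng.normal(size=4).astype(np.float32)        # input activations

scale = np.abs(W).max() / 127.0                  # map floats onto the int8 range
W_q = np.round(W / scale).astype(np.int8)        # 8-bit quantized weights

y_full = W @ x                                   # full-precision matrix multiply
y_quant = (W_q.astype(np.int32) @ x) * scale     # int8 multiply, then rescale

print(np.max(np.abs(y_full - y_quant)))          # quantization error is tiny
```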

Intel is designing a new chip specifically for Deep Learning called the Intel® Nervana™ Neural Network Processor (NNP)[4]. The Intel Nervana NNP supports deep learning primitives such as matrix multiplication and convolutions. Intel Nervana NNP enables better memory management for Deep Learning algorithms to achieve high levels of utilization of the massive amount of compute on each die.

The bottom-line translates to achieving faster training time for Deep Learning models.

Finally, a new company called "Groq" is building a special-purpose chip that it says will run 400 trillion operations per second, more than twice as fast as Google's TPU[5].

What do all these advancements in GPUs and TPUs mean to you, the consumer?

"Smart" apps that leverage these powerful processors and embedded AI | ML | DL frameworks to learn about you and deliver a hyper-personalized, prescriptive user experience.

It’ll be like a really smart, highly attentive personal assistant on steroids!

The Power of AI in Your Hands

Over the past few years, artificial intelligence has quietly worked its way into our everyday lives. Give a command to Siri or Alexa and AI kicks in to translate what you said and look up the answer. Upload a photo to Facebook and AI identifies the people in the photo. Enter a destination into Waze or Google Maps and AI provides updated recommendations on the best route. Push a button and AI parallel parks your car all by itself (dang, where was that during my driver's test!).

With advances in computer processors and embedded AI | ML | DL frameworks, we are just beginning to see the use cases. And like the Derm Expert app highlights, the way that we live will never be the same.

Sources:

[1] Build More Intelligent Apps With Machine Learning

Figure 2: Core ML

[3] Are limitations of CPU speed and memory prevent us from creating AI systems

[4] Intel® Nervana™ Neural Network Processors (NNP) Redefine AI Silicon

[5] Groq Says It Will Reveal Potent Artificial Intelligence Chip Next Year

The post The Consumerization of Artificial Intelligence appeared first on InFocus Blog | Dell EMC Services.

How the Eagles and Patriots Can Avoid a Championship Let Down: Play the Percentages
https://infocus.dellemc.com/william_schmarzo/how-the-eagles-and-patriots-can-avoid-a-super-bowl-lii-let-down-play-the-percentages/
Mon, 29 Jan 2018

Figure 1: ESPN NFL Gamecast

It’s Super Bowl LII week and the Philadelphia Eagles and New England Patriots are on the precipice of a championship. One game to decide it all, where one decision could swing the fortunes of either team, depending on a single play call. What can Bill Belichick or Doug Pederson do to avoid a letdown or propel his team to victory?

It’s simple – play the percentages.

We only need to look back one year to observe how a coach’s thought process may have prevented his team from claiming a Super Bowl title.

It was Super Bowl LI. Tom Brady had just thrown an interception that the Atlanta Falcons returned for a touchdown. Atlanta held a 21-0 lead with 2:21 left in the first half. By ESPN's projections, the Falcons at this point in the game had a 96.6% probability of winning.

Jump to the third quarter, the Patriots, then trailing 28-3, faced fourth-and-3 at their own 46-yard line with 6:04 on the clock. Tom Brady completed a pass to Danny Amendola for 17 yards. This single play, yielding a Patriots first down, extended the Patriots offensive drive and increased their “Win Expectancy” from 0.2 percent to 0.5 percent (+0.3 percent increase).

It wasn’t until the Patriots scored with 57 seconds remaining in the game to force overtime that they rose from handicappers’ deathbed and evened the game’s win probabilities (see Figure 2). The Patriots ultimately won the game in overtime, overcoming seemingly insurmountable odds.

Understanding Probabilities to Win

How did the New England Patriots achieve such an unlikely comeback? Or maybe more relevant – how could the Atlanta Falcons commit such a mind-boggling, unprecedented choke job?

Figure 2: Patriots versus Falcons Win Expectancy, Super Bowl LI

 

Let's look to the card table to learn how basic probabilities can help humans make better decisions. From "A Blackjack Pro Explains How Ignoring the Odds Cost the Falcons the Super Bowl", each decision in blackjack can be dictated by simple probabilities. The average blackjack player loses about 3 percent of his or her money. However, if the probabilities are played correctly, the house's edge shrinks to about 0.5 percent. Unfortunately, even when humans know the right action to take, very few actually play the probabilities correctly, because humans are overwhelmed with cognitive biases.
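To see how unforgiving the math is, consider the classic mistake of hitting on 15 (more on that below). A quick Monte Carlo sketch, using an infinite-deck simplification of my own for illustration, shows that the hit busts more often than not:

```python
import random

# Each of the 13 ranks is equally likely (infinite-deck approximation).
# An ace drawn on 15 counts as 1, since counting it as 11 would bust.
RANK_VALUES = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 10, 10, 10]  # A, 2-9, 10, J, Q, K

trials = 1_000_000
busts = sum(15 + random.choice(RANK_VALUES) > 21 for _ in range(trials))
print(f"P(bust when hitting on 15) ~= {busts / trials:.3f}")  # ~0.54
```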

Like casinos' algorithms that determine the odds and outcomes of everything from slot machines to roulette, NFL front offices would benefit from applying Machine Learning to analyze thousands of football games played over the past 10 years. They could then analyze all possible situations and calculate the probability of each outcome. From there, all a coach needs to do is follow the math. But as in blackjack, it is hard to stay focused on a statistics-based strategy under the stress and excitement of the moment.
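As a hedged sketch of what such a model could look like, the snippet below trains a logistic-regression win-probability model on synthetic game states. The features, the label-generating rule, and the numbers are all invented for the example; a real front office would fit this on actual play-by-play history.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 5000
score_diff = rng.integers(-28, 29, n)      # team's lead in points
seconds_left = rng.integers(0, 3600, n)    # game time remaining

# Invented rule for labels: a lead matters more as the clock runs down.
logit = 0.2 * score_diff * (1 + (3600 - seconds_left) / 3600)
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([score_diff, seconds_left])
model = LogisticRegression(max_iter=1000).fit(X, y)

# Query the model: up 19 points with ~17 minutes to play.
print(model.predict_proba([[19, 1020]])[0, 1])  # a probability near 1.0
```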

Up 28-9 with two minutes left in the third quarter, the Atlanta Falcons had a 99 percent chance to win Super Bowl LI, but then they ignored simple probabilities and compounded one bad decision with another:

  • First, Atlanta quarterback Matt Ryan did not let the play clock run down to fewer than 10 seconds on every Falcons offensive possession.
  • Second, by not running the football once they had a late lead (Falcons were gaining an above-average 5.8 yards per rush), they allowed the clock to stop on incomplete passes.

Both of these decisions gave the Patriots – and Tom Brady – more time to get back into the game. All Atlanta needed to do was execute a simple “run the ball” strategy and reduce the number of Patriots’ offensive possessions by one. Unfortunately for the Atlanta Falcons, their decision making was akin to hitting on a 15 in blackjack when the dealer had a six showing. The Falcons ignored basic probabilities and the result was the biggest turnaround in Super Bowl history…at their expense.

Humans Aren’t Good at Making Decisions

Human decision-making capabilities have evolved from millions of years of survival on the savanna. Necessity dictated that we become very good at recognizing patterns and making quick, instinctive survival decisions based upon those patterns (see the blog “Human Decision-Making in a Big Data World”).

Unfortunately, humans are lousy number crunchers. Consequently, humans have learned to rely upon heuristics, rules of thumb, anecdotal information, intuition and “gut” as our decision guides.

So what can Bill Belichick or Doug Pederson do to overcome our natural decision-making liabilities and keep their teams from becoming the next Atlanta Falcons? It starts with acknowledging and understanding our inherent decision-making or cognitive bias flaws.

Awareness is the starting point and while I could easily write a book on the subject, let’s cover a few of the more common decision-making traps with recommendations on how to manage around these traps.

Types of Human Biases or Decision Traps

Trap: Over-confidence

Over-confidence is when a decision maker places a greater weight or value on what they know and assumes that what they don’t know isn’t important.

Corrective Action: The Falcons entered the Super Bowl with the second-ranked passing offense in the NFL in 2016, while also boasting the fifth-best running attack. However, when it became crunch time, Atlanta leaned on their passing game. To their detriment, they ignored their running attack, certain their MVP quarterback Matt Ryan could finish the job.

Had the Falcons' coaching staff leveraged Machine Learning, they might have identified variables that were better predictors of in-game performance (e.g., Tom Brady's excellence in the 4th quarter, the Patriots' late-game defensive tendencies) and avoided becoming overconfident in a passing game that wilted in the second half. As a standard operating practice, football front offices should apply Machine Learning to mine the large body of football game data for the "known unknown" and "unknown unknown" relationships buried in the data.
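A sketch of what that mining can look like in practice: fit a tree ensemble to historical game records and rank the candidate variables by importance. The feature names and data below are invented stand-ins, not a real NFL dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(7)
features = ["pass_yards", "rush_yards", "turnovers",
            "qb_4th_qtr_rating", "opp_late_game_defense"]
X = rng.normal(size=(2000, len(features)))

# Synthetic outcome that secretly depends on the last two columns --
# the kind of "unknown unknowns" a model can surface.
y = (1.5 * X[:, 3] - 1.0 * X[:, 4] + rng.normal(size=2000)) > 0

forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
for name, score in sorted(zip(features, forest.feature_importances_),
                          key=lambda pair: -pair[1]):
    print(f"{name:24s} {score:.3f}")  # the two hidden drivers rank highest
```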

Trap: Anchoring Bias

An Anchoring Bias is a tendency to lock onto a single fact as a reference point for future decisions, even though that reference point may have no logical relevance to the decision at hand.

Corrective Action: The Falcons, having held the Patriots to 215 yards and forced two turnovers in the first half, had reason to feel good about their defense. However, one half does not make a football game. Buoyed by that first-half performance and a 21-3 halftime lead, the Falcons failed to adapt to what the first half had shown them. The Patriots ran 23 plays over their final two first-half possessions, while Atlanta averaged 5 plays per drive in the first half despite scoring 14 offensive points (their third touchdown came via their defense). The length of the Patriots' drives should have alerted Atlanta that New England was poised for greater offensive effectiveness in the second half.

Atlanta failed to identify, validate, vet and prioritize the most relevant in-game metrics to create a more effective second-half offensive game plan to keep the Patriots off the field and stymie their eventual rally.

Real-time evaluation of advanced metrics to monitor desired behaviors and outcomes is increasingly important to in-game success.

Figure 3: Human Decision Making

Trap: Framing Effect

The Framing Effect is a cognitive bias in which a person’s decision is influenced by how the decision is presented. For example, humans tend to avoid risk when a positive frame is presented, but seek risks when a negative frame is presented.

Corrective Action: Into the 4th quarter, the Falcons banked on a passing game that had achieved 12.3 yards per completed pass in the first half. Had they simply followed the math (and the inevitable "regression to the mean"), the Falcons would have run the ball, burning clock and denying the Patriots the one additional possession that ultimately decided the game.

NFL coaches may not be inclined to invest the time to carefully frame in-game decisions or hypotheses. However, had Atlanta – prior to the game – leveraged Design Thinking techniques to create in-game scoring tables and maps charting the potential game flow, they could have referred to proven data, rather than gut instinct, to make the most appropriate in-game decisions.

Trap: Risk Aversion

Risk aversion is the result of people’s preference for certainty over uncertainty and for minimizing the magnitude of the worst possible outcomes to which they are exposed.  For example, a “risk averse” investor prefers lower returns with known risks rather than higher returns with unknown risks.

Corrective Action: Again, in the 4th quarter, the Falcons turned risk aversion upside down. With nine pass attempts to four running attempts, they leaned on lower-probability passing plays (a downfield pass has a lower probability of success than a run) rather than safely running the ball for first downs.

Falcons coaches did not take the time to understand the impact of Type I and Type II errors in decisions across different in-game situations (e.g., kickoffs, punts, third-and-long, fourth down, 2-point conversions, overtime). They could also have applied Reinforcement Learning algorithms to create analytic models of different in-game scenarios that objectively balance rewards and risks around desired outcomes.
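A minimal sketch of that reinforcement-learning idea, with invented success probabilities and point values (placeholders, not real NFL statistics): estimate the value of each fourth-down option from simulated outcomes, then let the learned values make the call.

```python
import random

# action: (success probability, reward if success, reward if failure)
ACTIONS = {
    "go_for_it":  (0.50, 3.0, -4.0),
    "punt":       (0.99, 0.5, -1.0),
    "field_goal": (0.55, 3.0, -2.0),
}

q = {action: 0.0 for action in ACTIONS}  # learned value of each action
alpha = 0.01                             # learning rate

for _ in range(100_000):
    action = random.choice(list(ACTIONS))         # explore every option
    p, win, lose = ACTIONS[action]
    reward = win if random.random() < p else lose
    q[action] += alpha * (reward - q[action])     # incremental value update

print(max(q, key=q.get), q)  # the call with the best learned value
```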

Trap: Sunk Costs

Sunk costs are retrospective costs that have already been incurred and cannot be recovered. Consequently, sunk costs should not factor into future decisions and should be ignored as if they never happened.

Corrective Action: Coaches in any sport rely heavily on tendencies and patterns. However, every situation – even one that mirrors a previous game – is unique. Making the same decision in the same situation in a different game does not guarantee the same outcome. It is a difficult habit for coaches to quit, but as the use of data science in sports increases and in-game analytics grows, coaches must ensure that sunk costs (i.e., previous in-game decisions that can't be reversed) are identified and made less influential in in-game decision making.

Trap: Endowment Effect

Endowment Effect is the hypothesis that people ascribe more value to things merely because they own them. We over-value what we have, which leads to unrealistic expectations on price and terms (e.g., stock traders who become attached to a stock they own and consequently have trouble selling it).

Figure 4: The Endowment Effect

 

Corrective Action: The Falcons' quarterback Matt Ryan was appearing in his first Super Bowl. The Patriots' quarterback Tom Brady had a history of Super Bowl heroics. Did the Falcons coaches over-value their own quarterback, clouding their judgment and their reliance on key predictive performance variables (e.g., quarterback rating, yards after catch, effectiveness under pressure) to guide in-game decisions? Basing analytics models on flawed variables can lead to sub-optimal and even wrong decisions.

Trap: Confirmation Bias

Confirmation Bias is the tendency to interpret new evidence as confirmation of one’s existing beliefs or theories. Confirmation biases impact how people gather information, but they also influence how we interpret and recall information.

Corrective Action: Did the Falcons’ first-half performance confirm a belief that was proven false? In sports, momentum can lead to wild swings in outcome. Did excellence in executing their game plan during the first half, and resulting confirmation bias, lead the Falcons astray in the second half?

This is why sports teams are investing heavily in in-game predictive models built by data scientists with expertise in other fields, in order to avoid introducing personal biases into the analytic models. The partnership between data scientists, who focus on identifying the most predictive data and algorithms, and coaches, who are responsible for the in-game decisions, represents a new dynamic in the 21st-century management of athletic teams.

Other Cognitive Biases of which to be aware include:

  • Herding (Safety in Numbers)
  • Mental Accounting
  • Reluctance to Own Mistakes (Revisionist History)
  • Confusing Luck with Skill
  • Regression to the Mean
  • Not Respecting Randomness
  • Over-emphasizing the Dramatic

Summary

Figure 5: Atlanta Falcons QB, Matt Ryan

Would a basic understanding of probabilities have saved the Atlanta Falcons from compounding a series of small but bad decisions into a painful loss? Maybe not, because understanding and acting are two different things. In the excitement of the moment, humans tend to forget what they’ve been taught and react instinctively.

Awareness is step one, but training is the ultimate solution. Decision makers need to be trained to "take a breath" and consult the models and numbers before rushing into a decision. Research shows that one of the keys to making clear-headed decisions is to have a feeling of control. NASA and the Navy SEALs accomplish that with repeated training[1].

Las Vegas is built on our inherent number-crunching flaws and our inability to think with a clear head when the excitement, flashing lights, and pounding music are driving us to use our gut, not our brains, to make decisions.

Don’t think for a moment that those majestic casinos are built by giving away money to gamblers.

Sources:
Figure 2: Win Probability
Figure 4: The Endowment Effect
Figure 5: Photo courtesy of atlantafalcons.com
[1] The Secret to Handling Pressure Like Astronauts, Navy Seals, and Samurai
Additional Sources on Cognitive Biases:
20 Cognitive Biases That Screw Up Your Decisions
Cognitive Bias Codex

The post How the Eagles and Patriots Can Avoid a Championship Let Down: Play the Percentages appeared first on InFocus Blog | Dell EMC Services.

Bill Schmarzo's Top 2017 Big Data, Data Science and IOT Blogs
https://infocus.dellemc.com/william_schmarzo/bill-schmarzos-top-2017-big-data-data-science-and-iot-blogs/
Wed, 24 Jan 2018

To put us on the path for a successful and engaging 2018, here is a quick review of my top 10 blogs from 2017.

#10. Is Data Science Really Science?

Science works within systems of laws, such as the laws of physics, thermodynamics, mathematics, and many others. Scientists can apply these laws to understand why certain actions lead to certain outcomes or why something is going to occur.

While there may never be “laws” that dictate human behaviors, in the world of IOT where organizations are melding analytics (machine learning and artificial intelligence) with physical products, we will see “data science” advancing beyond just “data” science. In IOT, the data science team must expand to include scientists and engineers from the physical sciences so that the team can understand and quantify the “why things happen” aspect of the analytic models. If not, the costs could be catastrophic.

Figure 1: Scientific Method Beliefs and Biases

 

Note: I’m adding Figure 1 to this blog to highlight the importance of the Scientific Method and understanding basic statistical techniques to ensure that one is building their analytics on unbiased data against unbiased hypotheses.

#9. Design Thinking: Future-proof Yourself from AI

While there is a high probability that machine learning and artificial intelligence will play an important role in whatever job you hold in the future, there is one way to “future-proof” your career…embrace the power of design thinking.

Design thinking is defined as human-centric design that builds upon a deep understanding of our users (e.g., their tendencies, propensities, inclinations, behaviors) to generate ideas, build prototypes, share what you've made, embrace the art of failure (i.e., fail fast but learn faster) and eventually put your innovative solution out into the world. And fortunately for us humans (who really excel at human-centric things), there is a tight correlation between design thinking and machine learning (see Figure 2).

Figure 2: Design Thinking and Machine Learning Mapping

#8. 5 Steps to Building a Big Data Business Strategy

“The problem is that, in many cases, big data is not used well. Companies are better at collecting data – about their customers, about their products, about competitors – than analyzing that data and designing strategy around it.” “Companies Love Big Data but Lack the Strategy to Use It Effectively,” Harvard Business Review

Build a business strategy that incorporates big data. Build a business strategy that uncovers detailed customer, product, service and operational insights that serve as the foundation for optimizing key operational processes, mitigating compliance and cyber-security risks, uncovering new revenue opportunities, and creating a more compelling, more differentiated customer or partner experience.

#7. What tomorrow’s business leaders need to know about Machine Learning.

Much of what comprises "Machine Learning" is really not new. Many of the algorithms that fall into the Machine Learning category are analytic algorithms that have been around for decades, including clustering, association rules, and decision trees. However, the detail and granularity of the data, the wide variety of data sources, and a massive increase in computing power have re-invigorated many of these mature algorithms.

Machine learning is a type of applied artificial intelligence (AI) that provides computers with the ability to gain knowledge without being explicitly programmed. Machine learning focuses on the development of computer programs that can change when exposed to new data (see Figure 4). How can businesses, and business leaders, take advantage?
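The distinction in Figure 4 is easy to see in code. In the toy sketch below (synthetic data, illustrative only), a decision tree learns from labeled examples, while k-means finds the same structure with no labels at all:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 2)),    # group A
               rng.normal(4, 1, (50, 2))])   # group B
y = np.array([0] * 50 + [1] * 50)            # labels for A/B

tree = DecisionTreeClassifier().fit(X, y)                  # supervised: needs y
clusters = KMeans(n_clusters=2, n_init=10).fit_predict(X)  # unsupervised: ignores y

print(tree.predict([[4.1, 3.9]]))   # predicts the labeled class
print(clusters[:5], clusters[-5:])  # discovers the two groups on its own
```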

Figure 4: Supervised and Unsupervised Machine Learning Algorithms

#6. Is Blockchain the Ultimate Enabler of Data Monetization?

Blockchain is a data structure that maintains a digital ledger of transactions among a distributed network of entities.  Think of a “distributed ledger” that uses cryptography to allow each participant in the transaction to add to the ledger in a secure way without the need for a central authority or central clearinghouse (see Figure 5).
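The core of that "distributed ledger" idea fits in a few lines. The toy sketch below hash-links each block to its predecessor, so tampering with any past transaction breaks every later link; real blockchains add signatures, consensus, and peer-to-peer replication on top of this.

```python
import hashlib
import json

def block_hash(block):
    # Deterministic hash of a block's contents.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

chain = [{"prev": "0" * 64, "tx": "genesis"}]
for tx in ["Alice pays Bob 5", "Bob pays Carol 2"]:
    chain.append({"prev": block_hash(chain[-1]), "tx": tx})

# Tamper with an earlier transaction...
chain[1]["tx"] = "Alice pays Bob 500"
# ...and the next block's stored hash no longer matches: tampering detected.
print(chain[2]["prev"] == block_hash(chain[1]))  # False
```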

Figure 5: How to Use Blockchain Technology to Retain More Customers

Is blockchain the ultimate enabler of data and analytics monetization, creating marketplaces where companies, individuals and even smart entities (cars, trucks, buildings, airports, malls) can share/sell/trade/barter their data and analytic insights directly with others?

The impact on a company's financials could be overwhelming, or devastating, depending upon which side of the business model transformation you sit on.

#5. Data is a New Currency

When you insert something new, such as new demand, into a circular flow, you create an economic concept called the Multiplier Effect. Countries use this concept to consider how to invest money and how that investment, as it distributes through a supply chain, will impact their economy.

Multiplier Effect Definition: “An effect in economics in which an increase in spending produces an increase in national income and consumption greater than the initial amount spent.”
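The arithmetic behind that definition is a geometric series: if recipients re-spend a fixed share of each dollar (the marginal propensity to consume, or MPC), total income converges to the initial spend times 1/(1-MPC). The 0.8 below is an assumed textbook value, not a claim about any real economy.

```python
MPC = 0.8                    # assumed: 80 cents of each dollar gets re-spent
initial_spend = 1_000_000

# Sum of the geometric series 1 + MPC + MPC**2 + ... = 1 / (1 - MPC)
multiplier = 1 / (1 - MPC)
print(multiplier)                   # 5.0
print(initial_spend * multiplier)   # $1M of new demand -> $5M of total income
```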

Figure 6: Economic Multiplier Effect

 

Data exhibits a Network Effect: the same data can be used across multiple use cases at the same time, thereby increasing its value to the organization. I would contend that this network effect is, in principle, the same thing as the Multiplier Effect.

#4. 5 Questions that Define Your Digital Transformation

I had the opportunity in 2017 to give a 10-minute keynote at DataWorks Summit 2017. What sort of keynote could I give in just 10 minutes? Ten minutes is not long for a keynote, and to be honest, I struggled with what to say.

But after some brainstorming with my marketing experts, we came up with an idea:  Pose five questions that every organization needs to consider as they prepare themselves for digital transformation.  And while I didn’t have enough time in 10 minutes to answer those questions in a keynote, I certainly did in a blog!

Figure 7: 5 Questions that Frame Your Digital Transformation

 

You can also check out a video of my DataWorks Summit keynote presentation, complete with air guitar at the end so that I could embarrass my daughter (my presentation starts around the 39:30 mark)!

#3. Can Design Thinking Unleash Organizational Innovation?

Design Thinking, or human-centered design, is all about building a deep empathy with the people you’re designing for; generating tons of ideas; building a bunch of prototypes; sharing what you’ve made with the people you’re designing for; and eventually putting your innovative new solution out in the world (see Figure 8).

Figure 8: Stanford d.school Design Thinking Process

 

There is a good reason why Stanford's d.school does not sit within one of the university's existing schools. Design thinking is used in almost all of Stanford's schools, including business, computer science, electrical and mechanical engineering, and even healthcare. Design thinking appears to be one of the secret sauces behind Stanford's success in cultivating the entrepreneurial spirit of its students and faculty (and neighbors, in my case).

#2. The Future Is Intelligent Apps

I have seen the future!  The future is a collision between big data (and data science) and application development that will yield a world of “intelligent apps.”

These “intelligent apps” combine customer, product, and operational insights (uncovered with predictive and prescriptive analytics) with modern application development tools and user-centric design to create a more compelling, more prescriptive user experience.

Intelligent apps will support or enable key user decisions, while continually learning from the user interactions to become even more relevant and valuable to those users.

The journey to building intelligent applications starts by understanding the decisions that key business constituents need to make in supporting their business and operational objectives.

Figure 9: Intelligent Application Stack

 

And my #1 blog of 2017 (drum roll please)…

#1. Difference between Big Data and Internet of Things

What are the differences between big data and IOT analytics? Big data analyzes large amounts of mostly human-generated data to support longer-duration use cases. IOT aggregates and compresses massive amounts of low latency / low duration / high volume machine-generated data coming from a wide variety of sensors to support real-time use cases.

I don't believe that loading sensor data into a data lake and performing data science to create predictive analytic models qualifies as doing IOT analytics. To me, that's just big data (and potentially REALLY BIG DATA, with all that sensor data). Claiming to deliver IOT analytic solutions requires big data (with data science and a data lake), but IOT analytics must also include:

  • Streaming data management with the ability to ingest, aggregate (e.g., mean, median, mode), and compress real-time data coming off a wide variety of sensor devices "at the edge" of the network.
  • Edge analytics that automatically analyzes real-time sensor data and renders real-time decisions (actions) at the edge of the network to optimize operational performance (blade angle or yaw) or flag unusual performance or behaviors for immediate investigation (security breaches, fraud detection). A minimal sketch of both patterns follows.
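Here is a hedged, toy illustration of those two bullets; the readings, window size, and alert threshold are all invented:

```python
from statistics import mean, median

WINDOW = 5               # readings per summary window (assumed)
ALERT_THRESHOLD = 90.0   # anomaly cutoff (assumed)

buffer = []
for reading in [71.2, 70.8, 72.1, 95.3, 71.5, 70.9, 71.1, 72.0, 70.7, 71.3]:
    if reading > ALERT_THRESHOLD:
        # Edge analytics: act on the anomaly immediately, at the edge.
        print(f"EDGE ALERT: anomalous reading {reading}")
    buffer.append(reading)
    if len(buffer) == WINDOW:
        # Streaming aggregation: ship only the compressed summary upstream.
        print(f"window summary: mean={mean(buffer):.1f}, "
              f"median={median(buffer):.1f}")
        buffer.clear()
```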

Sources:

Figure 1: Scientific Method Beliefs and Biases

Figure 4: Supervised and Unsupervised Machine Learning Algorithms

Figure 5: How to Use Blockchain Technology to Retain More Customers

 

The post Bill Schmarzo’s Top 2017 Big Data, Data Science and IOT Blogs appeared first on InFocus Blog | Dell EMC Services.
