InFocus Blog | Dell EMC Services

3 Imperatives for Artificial Intelligence Success: People, Process, Technology

As I speak to customers about Big Data and Analytics, the topic of Artificial Intelligence (AI) inevitably comes up. I am often asked how organizations can use AI to monetize their data and transform their business processes.

While Artificial Intelligence can absolutely help drive those kinds of results, there isn’t a simple answer to these types of questions because many challenges must be overcome first, particularly around:

  • Having a strategy and the right processes in place
  • Skills and expertise of analytical users
  • Big Data and analytical platforms and technologies

The best way to get started is to focus on the business outcomes and analytical models and not on the underlying technologies.

Our Approach to Artificial Intelligence

Dell EMC Consulting has been helping organizations of all sizes, industries, and maturity levels adopt and accelerate their analytical journey by making data-driven decisions to select the right use cases, implement scalable platforms, and operationalize algorithms to drive business outcomes. We do this by partnering with business stakeholders to select the killer use cases, apply the right technologies to implement them, and provide expert data science and engineering resources to get quick wins to establish meaningful change throughout the organization.

We work with organizations to validate the use cases for analytical feasibility and then select the right tools and techniques, leveraging Artificial Intelligence, Machine Learning, and Deep Learning techniques. Our methodology is to not only create the analytical models leveraging these tools and techniques, but also to teach organizations how to build and maintain them over time. Our goal is to help our customers become self-sufficient by providing scalable solutions and best practices around the analytical toolsets.

For any Artificial Intelligence implementation to be successful, it’s imperative that you bridge the people, process, and technology needed to achieve the business outcomes you want. Our Consulting practice works with customers to do just that, and helps ensure you get the most from your Big Data investments.

New Ready Solutions for AI (Artificial Intelligence)

When it comes to the technologies, Dell EMC has you covered with verified architectures and deployments in the form of Ready Solutions. Dell EMC just announced new Ready Solutions for AI, with a design for Machine Learning with Hadoop and one for Deep Learning with NVIDIA. These end-to-end solutions include the best-of-breed Dell EMC hardware, software, and services needed to fast-track and simplify your AI journey.

Dell EMC Services are an integral part of the solutions, helping customers drive the rapid adoption and optimization of their AI environments from initial set up and upskilling of resources through to ongoing support.  A core set of services are included with the solutions, and other value-added services are available for customers to choose from based on their business requirements and priorities.

Some of the key areas where our services teams deliver value for Ready Solutions for AI customers include:

  • Implement and operationalize the Ready Solution technologies and AI libraries, and scale customer Data Engineering and Data Science capabilities
  • Provide technical architecture recommendations and data science best practices
  • Offer courses and certifications on Data Science and Advanced Analytics and workshops on Machine Learning in collaboration with NVIDIA to develop the solution and technology skills needed to fully leverage the AI capabilities
  • Provide comprehensive hardware and collaborative software support to help ensure optimal system performance and minimize downtime

For the remainder of this blog, let’s delve into how Dell EMC Consulting helps customers operationalize and accelerate the time to value of the Ready Solutions for AI so they can start getting actionable insight from their data.

New AI Consulting Services for the Ready Solutions

To support the rapid adoption and integration of the Dell EMC Ready Solutions for AI, we offer new AI advisory and implementation services that span initial environment setup, operationalization of the solution, and building of the necessary data science and data engineering capabilities.

Strategic Guidance and Expert Integration

Our consultants can engage from the start to plan and manage the Ready Solution implementation and help operationalize it in the customer’s environment. This includes setting up the Cloudera Data Science Workbench for the Hadoop design; for the NVIDIA design, we can help test and configure the AI libraries and tools.

Additionally, we can make architectural recommendations for your data science platform, along with providing best practices for data science, methods, tools and processes.  Defining the right, collaborative processes across teams – such as from the lines of business to data scientists and IT – is particularly important, as is industrializing the process to get as much value from your data as you can, as quickly as you can.

For more on process and tools, see these related blogs:

Achieving business results in the new environment

To realize the value from the new Dell EMC AI environment, customers need to have data science and data engineering capabilities.  Whether your organization is new to data science or you’re already far down the path, we can help get you where you need to be using a tailored approach driven by business objectives and priorities.

Data Science:

Dell EMC Consulting’s global team of data scientists will work with your business users to identify the right use cases and scope out the effort to implement them. Then, we will help your own teams, through an iterative process, develop and refine the data models and analytical algorithms that will be operationalized. We will also work with existing reporting teams to ensure that the models are measured and populating business KPIs. Finally, we will train your analytical teams on the new tools, methods, and techniques that leverage the ML/DL/AI technologies.

Data Engineering:

In addition to the Data Science expertise, Dell EMC Consulting has global expertise in data engineering to take those refined models and algorithms to an operational state. This is sometimes referred to as “monetization”, but really it is about the creation of value for your organization. This involves some of the “data plumbing” which connects the models to data sources, transforms and enriches the data, and creates output APIs to feed other applications. This is all done with an eye to security and privacy to ensure that your organization remains in compliance with regulatory and industry laws and policies. We will also work with your data architecture and engineering teams to ensure that these new capabilities integrate and align with your existing Big Data environment.

Bringing it all together

Dell EMC has the capabilities, experience, and technologies to help organizations accelerate their adoption of advanced data science technologies and techniques for Artificial Intelligence, Machine Learning, and Deep Learning.  The new Dell EMC Ready Solutions for AI offer a compelling way to fast-track and simplify AI using Dell EMC’s world class technology and proven AI expertise and services.  To learn more, contact your account executive or comment below.

Dell EMC Services Named SI & Consulting Windows 10 Global Deployment Partner of the Year

Is Windows 10 part of your digital workplace strategy? We can help.

Microsoft has named Dell EMC Services the “2018 SI & Consulting Windows 10 Global Deployment Partner of the Year.” The award was presented to Dell by the Microsoft One Commercial Partner (OCP) team at last month’s Microsoft Inspire.

Pictured: Janet Chess, Aaron Whitley, Fouad Ketani, Julie Watkins, Victor Morales, Mark Cabot, Nagi Punyamurthula, Wade Parsons

2018 marks the 8th consecutive year that Microsoft has recognized Dell as its “Partner of the Year” for Windows deployment.  Dell also earned this year’s “FY18 US Marketing Top Commercial Windows 10 Deployment Partner” award from the US Microsoft 365 team.

Pictured: Bill Clark, Phil Dakas, Margaret Arakawa, Chris Wright, Mark Cabot, Julie Watkins, Cyril Belikoff

The awards recognize Dell EMC Services as the #1 Global SI (Systems Integrator) for Windows 10—as well as the investments that Dell has made to make it easier for enterprises worldwide to deploy and migrate to Windows 10 and leverage the Microsoft 365-Modern Workplace.

Windows 10 is ready…  are you?

Windows 10 is the fastest-growing Windows operating system in history.

In today’s world, with security of utmost importance, Windows 10 is Microsoft’s most secure client OS to date. For example, the Windows 10 secure boot process ensures that only approved firmware can be loaded and only approved apps are started. Windows 10 also enables automatic encryption of data and multifactor authentication.

Since most breaches originate from the client device, enterprises cannot afford to wait to migrate to Windows 10.

If your organization hasn’t fully migrated to Windows 10, you’re not alone. But with the clock ticking down to January 14, 2020 when support for Windows 7 ends, now is a good time to investigate what Dell EMC Services can do as your Windows 10 solution provider.

Start with a big-picture Workforce Transformation strategy for how to best leverage Windows 10 as part of a cohesive digital workplace and end-user services plan.  We can help customize and integrate end-to-end Microsoft solutions into the existing environment and determine the right mix of on-premises, cloud, or hybrid deployment for Windows 10 and Office 365.

The Dell Client Deployment Assessment service evaluates current practices, then recommends and develops an action plan for Windows 10 migration. If device renewal is part of the plan, the Dell ProDeploy Client Suite service provides expert configuration and installation, reducing deployment time for new client systems by up to 35 percent.[1] With Dell PC as a Service, organizations can opt to combine hardware, software, lifecycle management, and financing into a single, all-inclusive per-seat, per-month price.

What about WaaS?

With the new Windows as a Service (WaaS) OS upgrade model, Windows releases become incremental.  Updates are delivered in smaller packages more frequently—a couple of times a year, rather than in one big package every 3-5 years. Users and administrators get new features and capabilities faster—and IT can say goodbye to disruptive, large-scale Windows OS migrations.

While incremental updating is a routine event with over-the-counter software and mobile devices, WaaS will shift how IT manages its enterprise Windows environment.  To help IT organizations ease the transition to Windows 10 updates as an ongoing maintenance activity, we created Dell Windows as a Service. Developed with Microsoft and built on the Microsoft WaaS framework, this new managed service leverages Dell EMC’s global Windows expertise to help internal teams take a policy-based and programmatic approach to updates and get up to speed on everything from identifying applications for additional testing to handling pilot release and production rollouts.

Most secure Windows ever

Another new managed service: Dell Windows Defender Advanced Threat Protection Managed Service is designed to help IT organizations augment the strength of Windows Defender ATP in Windows 10. Dell security professionals apply a range of cybersecurity methodologies and solutions to help protect users, data, and IP. The service helps organizations prevent and detect cyberattacks, as well as provides specific guidance and support for investigating and responding to attacks and other malicious behaviors.

All-in with Windows 10

At Dell, we’re all in with Windows 10.  It’s why we migrated 100% of our own workforce to Windows 10 in 2017—and why we’re working with companies of all kinds to accelerate and ease migration to Microsoft 365 (integrating Windows 10 Enterprise, the Office 365 suite, and Enterprise Mobility + Security (EM+S)).

Learn more about how Dell EMC consulting, deployment, integration, training and support services make it easier to enable people to work and collaborate securely, anywhere, on any device, to get digital business done. Contact your Dell EMC representative. 

[1] May 2016 Principled Technologies Report commissioned by Dell EMC. Testing results extrapolated from a 10-system deployment to project time savings for larger deployment compared to in-house manual deployment. Actual results will vary.

10 Key Business Considerations When Planning Your Microsoft Office 365 Migration Strategy [eBook]

Adopting Office 365 is a no-brainer for many organizations, which is why so many have started or plan to make the move. It’s pretty easy: just buy your Office 365 subscription and voilà, you’re on your way. Or are you?

Well, not quite.

I’ve personally helped many organizations migrate to Office 365, and it never fails: customers are often surprised by how many considerations require thoughtful attention. I suppose that’s why they partner with us, since we’ve helped so many customers successfully migrate to and adopt Office 365 services.

In my experience, organizations often make the mistake of thinking they can just move from on-premises to the cloud. Well, it’s not quite that easy. Office 365 is not just the traditional functionality of the on-premises productivity suite of applications. It’s a whole new suite of cloud services designed to change the way people work. Making the move, however, is not as simple as it may initially seem. Based on my extensive experience, I’ve identified 10 considerations your business should contemplate prior to beginning your Office 365 migration journey.

Infographic: 10 Considerations for Planning Your Move to Microsoft Office 365

Let’s walk through them one at a time:

  • Security & Trust – Maintaining security and trust is critical for a successful move to the cloud or hybrid environment. You’ll need to understand how to protect private and sensitive data, including users under the auspices of eDiscovery. You may want to consider using Office 365 features such as Data Loss Prevention and encryption to protect this content.
  • Identity Requirements – Office 365, as a multi-tenant cloud service, has some requirements for identity that, while often consistent with Active Directory implementations, may be inconsistent in some instances. There may be some actions required on your part to align with Office 365 requirements and recommendations.
  • Business Resiliency – While Office 365 will provide you with the majority of your business resiliency needs, your on-premises infrastructure may need to be optimized to support cloud resiliency.
  • Network Readiness – Moving to Office 365 changes network traffic patterns; therefore, it’s important to identify and manage risks. Understand the impact these workloads will have on your network to ensure it’s capable of meeting the required performance metrics. Keep in mind, your users’ Office 365 experience relies on network quality, which is why you’ll want to ensure there’s no negative impact to their perceptions of the new services.
  • Migration & Integration – There are lots of moving parts to consolidate and integrate with Office 365. Discovering existing content sources and determining which to migrate and integrate or retire altogether requires a methodical and comprehensive approach. Planning optimal migration and integration scenarios based on business groups, branch offices, application groups, and geographic locations requires careful coordination.
  • Accommodate Exceptions – It’s important to identify any entities that require special handling and those who may have migration restrictions so you can plan accordingly. There may be applications that restrict upgrading or environments not capable of supporting Office 365.
  • Customization & Third-party Add-ons – Custom applications can be the reason why organizations delay moving to Office 365. Not only is it a big effort to re-write custom widgets and apps, but many are concerned they’ll lose the ability to customize the experience as much in Office 365 as they can on-premises.
  • Management & Support – Transitioning to cloud-based services can be challenging for IT Teams, who are used to managing the entire ecosystem. Administration of Office 365 as a cloud-based platform can be challenging because much of it is managed outside the business, requiring the IT team to adopt a whole new way of managing the business.
  • Drive Value with Organizational Cultural Change – Adopting Office 365 is a fundamental cultural change for the entire organization. Specifically for IT, it’s a whole new mindset, which can be one of the most challenging aspects of transforming from an organization previously focused on infrastructure and platforms to user-centric applications and services. Continuous communication and feedback with end-users becomes the new norm.
  • Microsoft Partners – Microsoft understands the importance of having certified partners who can help you gain the most value from your Office 365 investment. That’s why they grant competencies to partners who’ve demonstrated proficiency in a particular area, by having both skilled employees as well as satisfied customers. You’ll want to work with a partner who has strong Microsoft experience.

If you’d like to understand our perspective in more depth, view our new eBook “Planning for Microsoft Office 365 – What to Consider as Part of Your Strategy“.

Office 365 Migration Strategy eBook

Don’t Go It Alone!

Moving to Office 365 can help your organization deliver the modern experiences employees have come to expect for achieving new heights in productivity and innovation. And it can help you deliver lower and more predictable IT spending. While there are many dependencies, migrations and integrations to consider as you plan your Office 365 migration strategy, you don’t have to go it alone.

Dell EMC is a Gold Certified Microsoft Partner, so whether you’re just starting or have already begun your Office 365 journey, we’re here to help. Dell EMC has helped customers of all sizes and complexities successfully adopt Office 365 and will bring that breadth of experience to your project. Contact your Dell EMC representative or find a resource in your area to learn how we can help you.

 

10 Key Business Considerations When Planning Your Microsoft Office 365 Migration Strategy [Infographic]

Microsoft Office 365 is not just the traditional functionality of the on-premises productivity suite of applications. It’s a whole new suite of cloud services designed to change the way people work. Making the move, however, is not as simple as it may initially seem. Based on my extensive experience, I’ve identified 10 considerations your business should contemplate prior to beginning your Office 365 migration journey.

The infographic below highlights the 10 considerations:

  • Security & Trust
  • Identity Requirements
  • Business Resiliency
  • Network Readiness
  • Migration & Integration
  • Accommodate Exceptions
  • Customization & Third-party Add-ons
  • Management & Support
  • Drive Value with Organizational Cultural Change
  • Microsoft Partners

Microsoft Office 365 Migration Infographic

If you’d like to understand our perspective in more depth, view our new eBook “Planning for Microsoft Office 365 – What to Consider as Part of Your Strategy”.

Office 365 Migration Strategy eBook

Don’t Go It Alone!

Moving to Microsoft Office 365 can help your organization deliver the modern experiences employees have come to expect for achieving new heights in productivity and innovation. And it can help you deliver lower and more predictable IT spending. While there are many dependencies, migrations and integrations to consider as you plan your Office 365 migration strategy, you don’t have to go it alone.

Dell EMC is a Gold Certified Microsoft Partner, so whether you’re just starting or have already begun your Office 365 journey, we’re here to help. Dell EMC has helped customers of all sizes and complexities successfully adopt Office 365 and will bring that breadth of experience to your project. Contact your Dell EMC representative to learn how we can help you.

Otherwise, comment below and let’s chat!

Industrializing the Data Value Creation Process

For organizations to maximize the data value creation process, it’s critical to have a clear line of sight from their business strategy and stakeholders through to the decisions that could be improved by applying machine learning and other techniques to the available data.

In recent months, what we’ve increasingly seen is Chief Data Officers taking a more active role in facilitating that process, focusing more on desired business outcomes and value delivery, and in doing so transforming themselves into Chief Data Monetization Officers. See the related blog, Data Monetization? Cue the Chief Data Monetization Officer.

For those outcomes to be fully realized and to create value on a true industrial scale, organizations need to have a laser focus on the process – automating steps and reducing waste to dramatically reduce the overall cost and time to insight for the production of “analytical widgets” in our “Data Factory”. If you think about it, that’s exactly what we’ve seen happening in the manufacturing world since the very first Model T rolled off the production line at Ford – why should the world of data be any different?

The Data Factory process really consists of 3 key steps. In the rest of this blog, I’ll outline each step and suggest how we might do more to industrialize the process.

Figure 1: Data Value Creation Process

Step 1 – Discover

The first step in the value chain is to explore the data in order to find something – to discover a pattern in the data that we might be able to apply in a subsequent step to create value for the business. Without Discovery, all you have in the data lake is lots of data: lots of cost and not a lot of value. None, in fact, which is why this is perhaps the most important step in the process.

Discovery could be just about anything but most often we will be looking to optimize a customer interaction, such as applying personalization elements to an application to make content or an offer more relevant and compelling. Applying the personalization comes in Step 2, but before we get there, we need to uncover the pattern that allows us to be more personal.

To find patterns in Discovery, the data scientist will iterate through a number of steps to prepare data, build a model and then test it until the best one is found. The process is iterative as many factors can be changed such as the way data is prepared, the algorithm used and its parameters. As a model is a representation of knowledge or reality, it is never perfect. The Data Scientist will be looking for the one that performs the best for that specific business problem.

You can think about the value at this stage as personal value. Value to the data scientist in what they have learned, not commercial value to the organization. For that to happen, we need to operationalize the pattern we found by applying the model. See step 2 below.

Testing Models with Machine Learning and Data Science

This isn’t meant to be a data science primer but before we move into the Monetize step, it might be helpful to briefly review some of the basics around Data Science.

To keep it simple, let’s imagine we have a classification problem where we are trying to predict which customers are most likely to respond to a particular marketing campaign and we are going to build a classification model using existing sales and customer data so we can do just that.

To avoid over-fitting the data and to ensure that the model remains accurate when new data is applied in the future, we split our data and keep some back so we can test our model with data it has not seen during the training process. We can then tabulate the results into a “confusion matrix” and look at the types of errors made and the general classification rate. False positives are where the model predicted a purchase and no purchase was made; a false negative is the other way around.
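
To make that concrete, here is a minimal sketch of a train/test split and confusion matrix in Python with scikit-learn. It is illustrative only: the file name, column names, and choice of classifier are hypothetical stand-ins, not part of the original post.

```python
# A minimal sketch of a train/test split plus confusion matrix.
# The file name and column names are hypothetical stand-ins.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

customers = pd.read_csv("campaign_history.csv")    # hypothetical historical data
X = customers.drop(columns=["responded"])          # features
y = customers["responded"]                         # 1 = purchased, 0 = did not

# Hold 30% of the data back so the model is tested on data it never saw.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Rows are actual outcomes, columns are predictions; the off-diagonal cells
# are the false negatives and false positives discussed above.
print(confusion_matrix(y_test, model.predict(X_test)))
```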

Whether any model is good or bad is very contextual. In our case, the 200 false positives may be great if the cost of the campaign is low (email) but may be considered poor if the campaign is expensive or these are our best customers and they’re getting fed up with being plagued with irrelevant offers! The situation is similar with the false negatives. If this is your premium gateway product and there is any chance of someone purchasing it, you may decide this misclassification is OK; however, if it’s a fraud problem and you falsely accuse 300 customers, then that’s not so great. See the blog on Is Data Science Really Science for more on false positives.

Figure 2: Sample Model Prediction (Confusion Matrix)

When we score our testing data, the model makes a prediction of purchase or non-purchase based on a threshold probability, typically 0.5. As well as changing the model algorithm and parameters, one of the other things the Data Scientist might do is to alter the threshold probability or misclassification cost to see how it impacts the errors in the confusion matrix, making adjustments based on required business goals so the best overall model is found.
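
As an illustration of that threshold adjustment, the sketch below continues the hypothetical model, X_test, and y_test from the earlier snippet and shows how moving the cut-off away from the default 0.5 changes the error counts; the thresholds shown are arbitrary examples.

```python
# A minimal sketch of varying the decision threshold instead of the default 0.5.
# Continues the hypothetical model, X_test and y_test from the previous sketch.
from sklearn.metrics import confusion_matrix

purchase_prob = model.predict_proba(X_test)[:, 1]   # probability of purchase

for threshold in (0.3, 0.5, 0.7):
    predicted = (purchase_prob >= threshold).astype(int)
    tn, fp, fn, tp = confusion_matrix(y_test, predicted).ravel()
    print(f"threshold={threshold}: false positives={fp}, false negatives={fn}")
```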

Another approach to optimizing marketing campaign effectiveness is to rank results using “expected value” which we calculate by multiplying the probability of a purchase by the expected (purchase) value, often using the customer’s average previous purchase value as a proxy.

For example, we might want to mail the top 10,000 prospects and maximize income from the campaign so we rank our customers by expected value and select the top 10,000. In this way, someone with a probability of 0.3 but average purchase value of $1000 would be higher in our list than someone with a much higher probability of 0.8 and lower average value of $100 (expected value of 300 vs 80).
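
A minimal sketch of that expected-value ranking follows. It assumes a hypothetical table of customers that already carries a model-scored purchase probability and an average historical purchase value as the proxy described above.

```python
# A minimal sketch of expected-value ranking: P(purchase) x average purchase value.
# The scores table and its columns are hypothetical.
import pandas as pd

scores = pd.DataFrame({
    "customer_id": [101, 102],
    "purchase_probability": [0.3, 0.8],      # from the propensity model
    "avg_purchase_value": [1000.0, 100.0],   # proxy for expected purchase value
})

scores["expected_value"] = (
    scores["purchase_probability"] * scores["avg_purchase_value"]
)

# Customer 101 (0.3 * 1000 = 300) ranks above customer 102 (0.8 * 100 = 80),
# matching the example in the text; take the top 10,000 for the campaign.
top_prospects = scores.sort_values("expected_value", ascending=False).head(10_000)
print(top_prospects)
```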

I’ve just used a simple example here to avoid confusion – the real world is rarely that straightforward, of course. We may need to boost or combine models or tackle unsupervised modeling techniques, such as clustering, that are non-deterministic and therefore require greater skills on the part of the Data Scientist in order to be effective.

Step 2 – Monetize

It’s worth noting that I’m using the word “monetize” here as shorthand for “creating value” from the data. I’m not suggesting selling your data, although that may be the case for a limited set of companies. It may also have nothing to do with actually making money – in the case of a charity or government organization the focus will be on saving costs or improving service delivery – but the same broad principles remain the same regardless.

It’s also worth noting that not all of the models coming out of the Discovery step will need to be operationalized into an operational system such as an eCommerce engine. It may be that the insights gained can simply help to refine a manual process. For example, a retailer might benefit from looking at the profile of customers purchasing a particular group of fashion products to see how it aligns to the target customer segment identified by the merchandising team.

Having said that, in most cases, more value is likely to be created from applying AI and machine learning techniques to automated processes given the frequency of those decision points.

We will focus more on this aspect in the remaining part of this blog.

For those problems where we are looking to automate processes, the next thing we need to do is to monetize our model by deploying it into an operational context. That is, we set it into our business process to optimize it or to create value in some way such as through personalization. For example, if this was an online shopping application we might be operationalizing a propensity model so we display the most relevant content on pages or return search results ranked in relevance order for our customers. It’s these kinds of data-driven insights that can make a significant difference to the customer experience and profitability.

What we need to do to operationalize the model will depend on a number of factors, such as the type of model, the application that will consume the results of the model and the tooling we’re using. At its simplest, commercial Data Science tooling like Statistica and others have additional scoring capabilities built in. At the other end of the spectrum, the output from the Discovery process may well just land into the agile development backlog for implementation into a new or existing scoring framework and associated application.
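
As a purely illustrative sketch of what such a scoring framework might look like, the snippet below wraps a trained propensity model in a small web service that an e-commerce application could call to rank content. The framework choice (Flask), the model file name, and the feature payload are all assumptions, not a prescribed design.

```python
# A minimal, illustrative sketch of a scoring service: an e-commerce application
# posts candidate offers and gets them back ranked by propensity.
import joblib
import pandas as pd
from flask import Flask, jsonify, request

app = Flask(__name__)
model = joblib.load("propensity_model.joblib")   # hypothetical serialized model

@app.route("/score", methods=["POST"])
def score():
    # Expects a JSON list of candidate offers carrying the model's feature columns.
    candidates = pd.DataFrame(request.get_json())
    candidates["propensity"] = model.predict_proba(candidates)[:, 1]
    ranked = candidates.sort_values("propensity", ascending=False)
    return jsonify(ranked.to_dict(orient="records"))

if __name__ == "__main__":
    app.run(port=8080)
```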

Step 3 – Optimize

I’ve already mentioned that no machine learning model is perfect and to further complicate things, its performance will naturally decay over time – like fine wines, some may age delicately, while others will leave a nasty taste before you get it home from the store!

That means we need to monitor our models so we are alerted when performance has degraded beyond acceptable limits. If you have multiple models and decision points in a process, one model may also have a direct impact on another. It is this domino effect of unforeseen events which makes it even more important not to forget this step.

Another area where the Data Scientist will have a role to play is in the refinement of model testing to ensure statistical robustness. To fast track the process, a Data Scientist may combine many branches of a decision tree into a single test to reduce the number of customers needed in the control group when A:B testing to understand model lift.

Having been alerted to a model that has been degraded through this kind of testing, we’ll need to refresh the model and then re-deploy as appropriate. In many cases, we may just need to re-build the model with a new set of data before deploying the model again. Given that the data and model parameters are going to remain unchanged, this type of task could readily be undertaken by a more junior role than a Data Scientist. If a more complete re-work of the model is required, the task will be put into the Data Scientist backlog funnel and prioritized appropriately depending on the criticality of the model and impact on profits.  Although there is more work involved than just a simple re-calibration, it will still likely be far quicker than the initial development given more is known about the independent variables and most, if not all, of the data preparation will have been completed previously.
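
As a rough illustration of that monitor-and-refresh loop (not a prescribed implementation), the sketch below checks a deployed model’s score on recently labelled data and triggers a simple re-fit on fresh data when performance drops below an agreed limit; the metric, threshold, and data inputs are hypothetical.

```python
# A rough sketch of a monitor-and-refresh check: if the deployed model's score on
# recently labelled data drops below an agreed limit, re-fit it on fresh data.
from sklearn.metrics import roc_auc_score

ACCEPTABLE_AUC = 0.75   # assumed business limit, for illustration only

def monitor_and_refresh(model, recent_X, recent_y, fresh_X, fresh_y):
    """Return the (possibly refreshed) model and its measured AUC on recent data."""
    current_auc = roc_auc_score(recent_y, model.predict_proba(recent_X)[:, 1])
    if current_auc >= ACCEPTABLE_AUC:
        return model, current_auc            # still healthy, leave it deployed
    # Same features and parameters, new data: the routine re-build described above
    # rather than a full re-work by a Data Scientist.
    model.fit(fresh_X, fresh_y)
    return model, current_auc
```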

Just like in the previous step, if you are using commercial Data Science software to deploy your models, some model management capability will come out of the box. Some may also allow you to automate and report on A:B testing across your website. However, in most instances, additional investments will be required to make the current operational and analytical reporting systems more agile and scalable to meet the challenges placed on them by a modern Digital business. If the business intelligence systems can’t keep pace, you will need to address the issue one way or another!

Industrializing the Process

Techniques and approaches used in modern manufacturing have changed immeasurably since Henry Ford’s day to a point where a typical production line will receive parts from all over the world, arriving just in time to make many different products – all managed on a production line that just doesn’t stop. Looking back at our 3 steps by comparison, it’s clear we have a lot to learn.

A well-worn phrase in the industry is that a Data Scientist will spend 80% of their time wrangling data and only 20% doing the Science. In my experience, Data Scientists spend the majority of their time waiting for the infrastructure, software environment and data they need to even start wrangling (see my related blog, Applying Parenting Skills to Big Data: Provide the Right Tools and a Safe Place to Play…and Be Quick About It!). Delays brought about while new infrastructure is provisioned, software stacks built, network ports managed and data secured all add to the time and costs for each of the data products you’re creating. As a result, the process is often compromised, with Data Scientists forced to use a shared environment or a standardized toolset. Without care and careful consideration, what we tend to do is turn what is, after all, a data discovery problem into an IT development one! There’s no ‘just in time’ in this process!

What if you could automate the process and remove barriers in the Discovery phase altogether?

The benefits could be huge!  Not only does that make best use of a skilled resource in limited supply (the Data Scientist), but it also means that downstream teams responsible for the Monetize and Optimize steps can schedule their work as the whole process becomes more predictable. In addition to the Data Science workload, what if the environment and toolchain required by the agile development team to Monetize our model (step 2) could also be automated?

Much can also be done with the data to help to accelerate the assembly process. Many types of machine learning models can benefit from data being presented in a “longitudinal” fashion. It’s typical for each Data Scientist to build and enhance this longitudinal view each time more is discovered about the data. This is another area that can benefit greatly from a more “industrialized view” of things – by standardizing data pre-processing (transformation) steps we improve quality, reduce the skills required and accelerate time to discovery. This is all about efficiency after all, but that also means we must add the necessary process so individual learning can be shared among the Data Science community and the standardized longitudinal view enhanced.
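
One way to picture that standardization (purely illustrative, and not the Elastic Data Platform itself) is a shared, versioned pre-processing pipeline that every data scientist reuses instead of re-inventing the transformations; the column names below are hypothetical.

```python
# A purely illustrative sketch of a shared, standardized pre-processing step that
# every data scientist reuses when assembling the longitudinal customer view.
from sklearn.compose import ColumnTransformer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

NUMERIC_COLUMNS = ["months_as_customer", "purchases_last_12m", "avg_basket_value"]
CATEGORICAL_COLUMNS = ["region", "acquisition_channel"]

standard_prep = ColumnTransformer([
    ("numeric", StandardScaler(), NUMERIC_COLUMNS),
    ("categorical", OneHotEncoder(handle_unknown="ignore"), CATEGORICAL_COLUMNS),
])

def make_model_pipeline(estimator):
    """Attach any estimator to the shared, standardized data preparation."""
    return Pipeline([("prep", standard_prep), ("model", estimator)])
```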

Back to Big Data Strategy

The point we started with was that creating value from data requires broader thinking than just a Big Data strategy. By looking in detail at the 3 steps in the value creation process, organizations can begin to unlock the potential value trapped in their data lakes and industrialize the process to eliminate costs and create greater efficiency with improved time to value.

At Dell EMC, we’re working with many of our customers to assess and industrialize their data value creation process and infrastructure. We’ve also created a standardized architectural pattern, the Elastic Data Platform, which enables companies to provide ‘just in time’ data, tools and environments for Data Scientists and other users to expedite the Discovery process. To learn more, check out this video featuring my colleague Matt Maccaux:

To learn even more about Data Monetization and Elastic Data Platform from Dell EMC experts, read the InFocus blogs:

Driving Competitive Advantage through Data Monetization

Avoid the Top 5 Big Data Mistakes with an Elastic Data Platform

Elastic Data Platform: From Chaos to Olympian Gods

 

Embracing the Dell Digital Way: How We Run IT to Succeed

CIOs and IT professionals have always hustled to keep critical systems at optimal performance and innovate wherever and whenever possible. However, with digital transformation on top of every CEO’s list of priorities, IT organizations must expedite how we deliver faster and better business outcomes while continually enhancing the performance and efficiency our users rely on.

It’s been clear for a while now that bimodal IT is not the answer because it creates too much complexity, especially for me as a CIO overseeing two industry-leading IT organizations. While technology has an important role to play, we need to transform ourselves and how we run IT to succeed in this digital world.

Here’s how:

First, at Dell, we call this cultural shift the Dell Digital Way. In addition to aligning our leaders closely with the company’s business unit leads, we’re cross-functionally collaborating to design, develop and deliver new products and capabilities much faster, and more importantly, iteratively. This operating model is essential to our success.

Second, our team members are core to this transformation. In addition to recruiting and retaining passionate technical team members, we need to embrace agile methodologies and other modern programming techniques and provide plenty of innovative digital and product development opportunities. In essence, we need to inspire innovation.

And, finally, embracing agile alone isn’t enough. We still need to provide the structure and foundation needed to keep things on the rails. As we increase agile adoption, we must also standardize and improve our application lifecycle processes and portfolio governance, financing, and reporting – and even how we charge our time. This is the business side of IT.

I recently had an opportunity to discuss digital transformation and the evolving role of the CIO with the experts at Deloitte. Check out the following video and Wall Street Journal article to learn more. And, I would love to hear your thoughts as well. After all, whether you’re a CEO, business partner or an IT practitioner, we are all in this together.

 

Demystifying Software-Defined Networks: A Decade Later, Where Are We Now (Part II)?

In part IV of this SDN/NFV blog series, I talked about how we arrived at the present state of SDN, a bit of its history, and future plans. In addition, I cited Google’s SDN implementation as a key example.

Deployment Example #2

Another company that is making use of SDN is Gap Inc. – yes, it’s not a technology company but the American fashion icon and parent company of Old Navy, Banana Republic and Intermix. The retailer is using SDN to connect its Internet stores to one another across its corporate network. In the words of Mr. Patel, Senior Network Architect and CTO of the SD-WAN-SDN project at Gap, “This software approach is about 50% less expensive than the conventional method of connecting stores together in a wide area network. Companies with many stores or branch offices are beginning to look at SDN networking to connect their stores together, but Gap is one of the first to implement this technology at scale and make public its efforts.”

Deployment Example #3

Microsoft Azure is my third example. The Redmond giant is actively implementing SDN for its Azure Public Cloud. In the words of Albert Greenberg, Microsoft lead technologist: “One of the key principles behind Azure’s SDN is its ‘Virtual Layer 2 Architecture,’ a Layer 3 Cross-Fabric that spans the entire data center.” He continues, “Automation is key to managing a massive, high-bandwidth network built with commodity components. The network state service that Azure uses as its control plane abstracts away the individual networks.”

To be able to mix and match the best network element hardware, Microsoft followed a very similar approach to Facebook, using the open source Switch Abstraction Interface (SAI) API to program the ASICs of the switches.

We also have major carriers like AT&T, with its Domain 2.0 initiative, and Telefonica, with Unica. AT&T has seen a data traffic increase of 100,000% on its wireless network (not a typo) from 2009 to the present day. That’s why it needed SDN and NFV implemented across its network: they are the only technologies that allow capacity to be added faster through automated deployment and push out upgrades at the speed of the Internet. AT&T is still planning to have 75% of its network virtualized by 2020.

So What’s the Conclusion?

SDN and NFV are part hype and part disruptive reality. I concur with the view that if SDN and NFV were just bubbles, they would have burst by now. It’s because there’s a real need for both of these technologies that the industry has kept investing in them.

If we think about it, we have been using the CLI to configure L2-L3 network elements for the last 35+ years without many major changes. It is true that we have new protocols, and that bandwidth, processing power, capacity, and the complexity of networks have skyrocketed, but the job of network engineers didn’t change much until we started pushing for SDN/NFV[1]. The paradigm changed – centralized control, separation of planes, abstraction, generic hardware, open source, etc. It is also worth noting that part of the difficulty of implementation is not just technical, but cultural: in most organizations, networking is typically siloed from the rest of the IT organization. The new approach to networking, SDN and DevOps, requires a different mindset, and it takes convincing, training, effort and time.

SDN Forecasts

According to a report from BCC Research, global SDN revenues are estimated to jump to over $56 billion in 2020. In addition, 100G will soon be the norm, and manual control won’t be enough – we will need automation all across the network, from top to bottom. The agility that SDN and NFV technologies provide will soon be key.

I believe these technologies have spearheaded the biggest leap in networking over the last 20 years; it’s just taking a bit longer for them to completely take over. Once the problematic interoperability and standardization issues are figured out, we’ll see massive implementations across the board.

The initial focus of OPNFV was between 2008 and 2015; the present and future focus is 2016 onward.

We have many initiatives, like ONAP and OPNFV, just around the corner. ONAP (a project combining AT&T’s ECOMP and Open-O[2]) provides for automatic, policy-driven interaction of these functions and services in a dynamic, real-time cloud environment. It’s not just a run-time platform; it includes graphical design tools for function/service creation, delivering capabilities for the design, creation, orchestration, monitoring, and lifecycle management of VNFs, SDNs and high-level services. OPNFV focuses on the virtualization layer, producing quality open source software specifically for the ETSI NFV interfaces VI-Ha, Vn-Nf, Nf-Vi, Vi-Vnfm, and Or-Vi of the ETSI reference architecture[3].

Summary

As a closing thought, I’d say 2020 will be the year of massive SDN/NFV implementation. We’ve come a long way in just 10 years, but work still needs to be done, on the higher layers in particular: on orchestration, on consolidating the platforms we have in place, and on improving the interoperability of automation features.

Part of the Blog Series

Demystifying Software Defined Networks Part III: SDN via Overlays

Demystifying Software-Defined Networks Part II: SDN via APIs

Demystifying Software-Defined Networks Part I: Open SDN Approach

Sources

[1] SDN | SDxCentral

[2] How ONAP Will Merge Millions of Lines of Code from ECOMP and Open-O

[3] ETSI Standards

 

Windows 10: Complex Migration or Modern IT Transformation?

If your organization still hasn’t fully migrated to Windows 10, you’re not alone. But with less than two years before extended support for Windows 7 ends, it’s time for businesses to kick their efforts into high gear.

I know what you’re thinking – easier said than done, right? Trying to understand what is required, when it needs to happen and what it means for your users is enough to make your head spin. And when you start considering the time and resources needed to make the move, you’d rather bury that spinning head in the sand.

One of the things I really enjoy about my job is interaction with customers. I genuinely like hearing about the challenges, frustrations and opportunities facing IT managers today. I’ve heard a lot of comments like “my environment is stable, why do I need the disruption of moving to Windows 10?”

Luckily, migrating to Windows 10 doesn’t have to be a complicated, headache-inducing challenge. In fact, with a little background information, a plan to move forward and the right partner on your journey, migrating to Windows 10 is an opportunity to modernize IT management for your organization.

And don’t just take my word for it. Our organization is one that has taken the leap and is now all-in with Windows 10. We successfully migrated 100% of Dell’s workforce in 2017 – that’s approximately 150,000 upgraded PCs. And we’ve helped countless others make the transition as well.

Read on, because you’re next!

Meeting the Needs of the Evolving Workforce

According to Microsoft, 22 million commercial users have already migrated to Windows 10 [1] and are realizing the benefits of the improved operating system – its deep integration of cloud services, universal app capabilities and enhanced security tools – and a new Windows as a Service model that delivers updates in smaller packages more frequently. (This could be your last large-scale rollout ever!)

Simply put, IT environments based on Windows 10 are more agile, more mobile, more effective and more secure. This is essential to meet the needs of a growing workforce who want easy, on-demand, collaborative experiences and the flexibility to work anywhere, anytime.

Best of Both Worlds with Hybrid Deployment

If Windows 10 is going to help modernize your IT environment and ultimately, how your users get things done – shouldn’t the deployment model transform as well? The ability to easily deploy and manage systems via cloud technology is a major perk of Windows 10 (PCs are essentially turned into thin clients for remote, over-the-air provisioning with solutions like VMware’s Workspace ONE or Microsoft’s AutoPilot).

However, the success of this model depends on the quality and speed of users’ connections, the size of the image and the number of locally installed applications which can significantly hamper the “out-of-the-box” experience. Although we believe this cloud-based model is the future of deployment, the reality is that most IT environments will require a hybrid approach in the near term to accommodate existing IT infrastructures and processes. Hybrid deployment offers the best of both worlds – the speed, control and reliability of factory configuration/installation with the scalability and cost-efficiencies of ongoing management via the cloud.

The Way Forward

So you know Windows 10 is coming, you know it’s important, and you know the benefits it will bring, but you still have concerns. How will you efficiently roll out this new OS, and what if users require new hardware to take full advantage of the features? Can you easily deploy new PCs at the same time?

We’ve been spending the last couple years helping customers through the various stages of their migrations with services designed to make the journey easier:

  1. Client Deployment Assessment – We assess your current IT practices, make recommendations and help develop an actionable plan for adopting Windows 10.
  2. ProDeploy Client Suite – Once the plan is in place, you can reduce deployment time for new client systems up to 35 percent [2] with Dell experts who will manage configuration and installation.
  3. PC as a Service – We combine hardware, software, PC lifecycle services and financing into one all-encompassing solution that provides you with a single price per seat per month.

Summary: The Time Is Now

Change is coming and we are here to help. I look forward to continuing my conversations with IT managers and hearing how things are progressing as they make this shift. With the right plan and the right partner, January 14th, 2020 is not just the deadline for migrating to Windows 10, it’s also the beginning of your transformation towards modern IT management.

Sources

[1] Windows 10 Embracing Silicon Innovation
[2] May 2016 Principled Technologies Report commissioned by Dell EMC. Testing results extrapolated from a 10-system deployment to project time savings for larger deployment compared to in-house manual deployment. Actual results will vary.

Are Your IT Resources Struggling to Tackle Digital Transformation?

We spend a lot of time talking to customers at Dell Technologies World, and this year we noticed that most of those conversations centered on transformation. We're living in an exciting time. There are a lot of great technology options for customers looking to transform their business. And yet, adopting new infrastructure can be a challenge. Not having the necessary expertise or resources reduces your capacity to grow and weakens your ability to focus on the transformation activities relevant to your business. New technology should translate into benefits and opportunities; it shouldn't be another challenge or burden on your IT staff.

Dee Rumford, Dell EMC Director of Services Sales, and I recently discussed one solution with Dell TV reporters at Dell Technologies World. We spent some time talking about transformations, the challenges customers face and how we help address those challenges.

Watch this video to learn more:

Digital transformation has been an enormous theme at this year’s show. What are some of the challenges companies face as part of that journey?

David: We constantly hear from customers that they're worried about adopting new technology. They have to be able to transition successfully from their current technology to the next-generation product to keep up with evolving business demands. The other challenge they face is how to use their technology effectively; they need to decide whether it makes sense to enable new features in their environment. Finally, companies are facing challenges around IT resource management. To implement and get value from new technology, you must have the right expertise, training, and knowledge, and that's not always easy to come by.

A lot of those challenges have to do with IT staffing. What can companies do to prepare their staff to manage digital transformation?

Dee: As companies move towards digital transformation, they're requiring their IT staff to be more strategic with their projects and initiatives while making sure day-to-day operations run smoothly. That's where Residency Services plays an instrumental role. With Residency Services, the customer gets highly skilled professionals with deep product expertise in our storage, server, and networking portfolio. They work as an extension of the customer's staff, helping them transition to their new Dell EMC technology faster and optimize their data center. This gives our customers valuable time back in their day to focus on the innovative projects that are critical to reaching their revenue goals.

What kinds of results are companies that use Residency Services seeing?

Dee: Recently, IDC conducted a study showing that organizations are getting significant value from Residency Services, both in improving their IT organizations and in overall business performance. In fact, nearly 98% of those surveyed recommended Residency Services. The study also found that businesses that use Residency Services:

  • Improved IT staff adoption and use of technology by 56%
  • Reduced number of problems by 54%
  • Improved overall technology performance by 53%
  • Improved overall IT staff efficiency by 49%

David: The great thing about Residency is that it’s different from traditional support and deploy services. It functions as an extension of the customer’s staff. Customers have the flexibility to change their Residency Services engagement as their business needs evolve.

What do you tell customers who are interested in bringing a Resident into their organization?

David: You need to take a holistic approach. Most of our customers aren't just looking to solve a single point-in-time problem; they're on a journey. Our standard services for support and deployment are a great foundation, but Residency can help across the entire IT lifecycle. Residents help customers transition from a pre-production state into a production state. They can also help by operating the infrastructure, freeing the customer's IT staff to focus on more strategic projects.

Want to learn more?

Check out Residency Services on DellEMC.com.

Demystifying Software-Defined Networks: A Decade Later, Where Are We Now (Part I)? https://infocus.dellemc.com/javier_guillermo/demystifying-sdn-a-decade-later-where-are-we-now/ Wed, 27 Jun 2018 09:00:15 +0000

The joke goes around that the true meaning of Software-Defined Networking (SDN) is "Still Don't Know." SDN is a technology that frees network administrators from the static architecture of traditional hardware-centric networks and lets them manage the network centrally and dynamically via open, programmatic interfaces. This is accomplished by separating the control plane (the system that decides where network traffic will go) from the data plane (the systems that forward the traffic to its destination).
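To make that split concrete, here is a minimal sketch of a controller application written with the open-source Ryu framework (chosen here purely for illustration; the post does not prescribe any particular controller). The application pushes a table-miss rule to every OpenFlow 1.3 switch that connects and then decides, centrally and in software, what the switches should do with unmatched packets; in this toy case, it simply floods them.

```python
# Minimal SDN controller app using the Ryu framework: the control plane
# lives here in software, while OpenFlow 1.3 switches (the data plane)
# simply forward packets according to the rules this app installs.
from ryu.base import app_manager
from ryu.controller import ofp_event
from ryu.controller.handler import CONFIG_DISPATCHER, MAIN_DISPATCHER, set_ev_cls
from ryu.ofproto import ofproto_v1_3


class HubController(app_manager.RyuApp):
    OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

    @set_ev_cls(ofp_event.EventOFPSwitchFeatures, CONFIG_DISPATCHER)
    def switch_features_handler(self, ev):
        # When a switch connects, install a table-miss rule that sends
        # unmatched packets up to the controller for a decision.
        datapath = ev.msg.datapath
        ofproto = datapath.ofproto
        parser = datapath.ofproto_parser
        match = parser.OFPMatch()
        actions = [parser.OFPActionOutput(ofproto.OFPP_CONTROLLER,
                                          ofproto.OFPCML_NO_BUFFER)]
        inst = [parser.OFPInstructionActions(ofproto.OFPIT_APPLY_ACTIONS,
                                             actions)]
        datapath.send_msg(parser.OFPFlowMod(datapath=datapath, priority=0,
                                            match=match, instructions=inst))

    @set_ev_cls(ofp_event.EventOFPPacketIn, MAIN_DISPATCHER)
    def packet_in_handler(self, ev):
        # The centralized decision for this toy app: flood every packet.
        msg = ev.msg
        datapath = msg.datapath
        ofproto = datapath.ofproto
        parser = datapath.ofproto_parser
        data = msg.data if msg.buffer_id == ofproto.OFP_NO_BUFFER else None
        out = parser.OFPPacketOut(datapath=datapath,
                                  buffer_id=msg.buffer_id,
                                  in_port=msg.match['in_port'],
                                  actions=[parser.OFPActionOutput(ofproto.OFPP_FLOOD)],
                                  data=data)
        datapath.send_msg(out)
```

Run this with ryu-manager against an OpenFlow-enabled switch such as Open vSwitch (for example, inside a Mininet topology) and notice that the forwarding boxes themselves hold no decision logic: they only execute the rules the controller installs.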

Introduction – The Journey

Although we can trace the origins of SDN to the early 2000s, we've just reached the decade mark since the team behind the Ethane controller first released OpenFlow. In 2008, we saw the birth of NOX and the first OpenFlow switches. Later came Nicira (acquired by VMware in 2012, now part of Dell Technologies), Google's B4 project, the Open Networking Foundation (2011), and the development of ONOS.

He Said, She Said

Looking at the present, SDN, like many emerging technologies before it, seems to have developed two well-defined perspectives at opposite extremes. On one side, you have people who focus on the negatives and believe that SDN hasn't lived up to the hype of the last decade. On the other, you have folks who believe SDN will continue to grow and will explode in the 2020s, making bold predictions about market share and expansion rates[1].

But where is SDN really, now that a decade has gone by?

We all know that SDN is complicated, but is it an acronym for ‘Still Don’t Know?’

I believe we are somewhere in the middle of these two extremes. It's true that many experts still claim NFV is not going to be good enough for high-performance applications, and that you will always get better performance from dedicated hardware and software. Then again, not everything that gets virtualized requires the absolute best performance; often the priority is how easy the function is to deploy, automate, and reuse, and the CAPEX/OPEX savings that result.

On the SDN side, there were plenty of controllers five years ago, and much of the industry's effort has since consolidated around OpenDaylight, alongside other controllers such as Floodlight, ONOS, and VMware NSX. The consolidation is good, but has SDN lived up to the hype that started in 2012-2013? Has it been massively adopted and implemented everywhere?

Not quite.

In the words of Diego Lopez, chair of ETSI NFV ISG[2] and a senior technology expert at Telefonica, “Perhaps we rushed to commercialize NFV (and SDN) before we laid down the proper foundations and principles for the technology. We can all agree that the initial timeframes given six years ago were too optimistic and will need to be more realistic in the future, and there are still numerous problems left to be solved.”

The main issues customers face with adoption are the lack of standardization, especially at the higher layers, and interoperability. Still, we've come a long way, from SDN living only in research centers and academia at top tech universities to several large practical implementations in the "real world."
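To ground what programming the network "at the higher layers" looks like in practice, here is a hedged sketch that reads a controller's aggregated view of the topology over OpenDaylight's northbound RESTCONF interface instead of logging in to individual switches. The controller hostname, credentials, and exact RESTCONF path are assumptions for illustration only and vary by controller and release.

```python
# Hypothetical sketch: read the network topology from an SDN controller's
# northbound RESTCONF API rather than logging in to each switch. The host,
# credentials, and path below are illustrative and release-dependent.
import requests

CONTROLLER = "http://odl-controller.example.com:8181"   # assumed hostname/port
TOPOLOGY_PATH = "/restconf/operational/network-topology:network-topology"


def fetch_topology() -> dict:
    resp = requests.get(
        CONTROLLER + TOPOLOGY_PATH,
        auth=("admin", "admin"),                 # placeholder credentials
        headers={"Accept": "application/json"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    topology = fetch_topology()
    for topo in topology.get("network-topology", {}).get("topology", []):
        nodes = topo.get("node", [])
        print(f"Topology {topo.get('topology-id')}: {len(nodes)} node(s)")
```

The point is the workflow: a single API call against the controller returns an aggregated view of the network that would otherwise require touching every device.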

#1 Deployment Example

The key deployment was spearheaded by Google[3], which runs one of the largest (if not the largest, with apologies to AWS) enterprise and cloud deployments in the world. It is no secret that Google has been a strong supporter of OpenDaylight and OpenFlow, while continuing to encourage other controllers like ONOS. According to Amin Vahdat, "the fundamental design philosophy was that the network should be treated as a large-scale distributed system, leveraging the same control infrastructure we developed."

The company strategy is based on four pillars:

  1. Jupiter is a data center interconnect capable of supporting more than 100,000 servers and 1Pb/s of bandwidth.
  2. B4 is Google's WAN interconnect and its implementation of SD-WAN[4], built to connect its data centers to one another and replicate data in real time between individual campuses. "It's built on white boxes with our software controlling it," said Vahdat.
  3. Andromeda is a network functions virtualization (NFV) stack that delivers the same capabilities available to its native applications all the way to containers and virtual machines running on the Google Cloud Platform.
  4. Espresso, not the coffee (!), is the fourth and perhaps most challenging piece of the strategy. It extends SDN to the peering edge of Google's network, where it connects to other networks across the planet (Google exchanges data with ISPs at 70 metro locations and carries up to 25% of all internet traffic). Espresso allows Google to dynamically choose where to serve individual users from, based on real-time measurements of how network connections are performing. It lets Google maintain performance and availability in a way that is not possible with existing router-centric Internet protocols, by separating the logic and control of traffic management from the confines of individual network boxes. This translates into higher availability and better performance through Google Cloud than is available through the Internet at large.

Espresso improves the user experience on two fronts. First, it automatically selects the best data center location to serve a specific tenant/user, based on real-time statistics and formulas. Second, it separates traffic logic and control from the individual routers, following the SDN tenet of separating the planes. A single distributed system with an aggregated view of the overall network will always perform better and make better decisions than an individual hardware router.
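To illustrate only that last idea (an aggregated view making the steering decision), here is a toy sketch that picks a serving site for a user from network-wide, real-time path measurements. It is emphatically not Google's algorithm; the sites, metrics, and weights are invented for the example.

```python
# Toy illustration of globally informed traffic steering: pick the serving
# site from an aggregated, network-wide view of real-time measurements
# rather than from any single router's local table. All data and weights
# below are invented for the example.
from dataclasses import dataclass
from typing import List


@dataclass
class PathMetrics:
    site: str
    rtt_ms: float       # measured round-trip time to the user
    loss_pct: float     # measured packet loss on the path (percent)
    utilization: float  # current egress link utilization, 0.0 to 1.0


def score(m: PathMetrics) -> float:
    # Lower is better: latency dominates, with penalties for loss and for
    # links that are already running hot.
    congestion_penalty = 100.0 * max(0.0, m.utilization - 0.8)
    return m.rtt_ms + 50.0 * m.loss_pct + congestion_penalty


def choose_serving_site(measurements: List[PathMetrics]) -> str:
    return min(measurements, key=score).site


if __name__ == "__main__":
    live = [
        PathMetrics("metro-ams", rtt_ms=18.0, loss_pct=0.1, utilization=0.91),
        PathMetrics("metro-fra", rtt_ms=24.0, loss_pct=0.0, utilization=0.55),
        PathMetrics("metro-lhr", rtt_ms=21.0, loss_pct=0.4, utilization=0.60),
    ]
    print("Serve user from:", choose_serving_site(live))
```

In this made-up run the lowest-latency site loses because its egress link is nearly saturated, which is exactly the kind of decision a box-by-box view cannot make.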

Summary

There is no clear agreement on the question "Did SDN live up to the hype a decade later?" Its adoption has been growing steadily, but it hasn't replaced "traditional" networking yet. Once consolidation, standardization, and interoperability at the higher layers are achieved, its adoption may be unstoppable, and major implementations like Google's may become the norm.

Our team of expert Dell EMC Services consultants and advisors is, as always, here to answer your questions and recommend a solution that best fits your needs.

Sources

[1] Software Defined Networking: Technology and Global Markets

[2] ETSI Standards

[3] Espresso makes Google cloud faster, more available and cost effective by extending SDN to the public internet

[4] SDxInsights on SD-WAN

Blog Series

Demystifying Software Defined Networking Part III: SDN via Overlays

Demystifying Software-Defined Networks Part II: SDN via APIs

Demystifying Software-Defined Networks Part I: Open SDN Approach

 
