Colin Sainsbury – InFocus Blog | Dell EMC Services

Unified Endpoint Management: One Tool to Rule Them All?
Mon, 04 Feb 2019 10:00:52 +0000

Recently, much of the buzz in the end-user computing world has been around the move to unified endpoint management. As with many concepts in IT, unified endpoint management (UEM) is defined more by marketing departments than by any rigid scientific or legal method. It is the latest step in a journey that endpoint device management has been on for a while: the convergence of client management tools (CMT), mobile device management (MDM) and enterprise mobility management (EMM) toolsets.

The challenge is that the definition of UEM depends on who is in the conversation. The definition from Wikipedia (derived from Gartner) is probably the best that I have seen:

“Unified Endpoint Management is a class of software tools that provide a single management interface for mobile, PC and other devices. It is an evolution of, and replacement for, mobile device management (MDM) and enterprise mobility management (EMM) and client management tools.”

The Gartner paper is behind their paywall but VMware has made the entire Magic Quadrant paper available for download for free.

[Figure: VMware Workspace ONE in the Gartner Magic Quadrant | Source: Gartner, June 2018]

The Unified Endpoint Management definition above shows the convergence of the toolsets used for MDM-managed devices (typically macOS, iOS and Android) with those used for Windows. This reflects both the growing importance of the first category of devices within the workplace and Microsoft's decision to include the Open Mobile Alliance Device Management (OMA-DM) protocol within Windows 10.

Furthermore, the UEM toolsets are typically cloud-hosted, although some have on-premises variants for those more cloud-averse organisations. This cloud hosting delivers two key benefits:

  1. There is no infrastructure to design and maintain. The software vendor provides you with a tenant and keeps adding patches and new features to it.
  2. These days, devices work outside of the corporate network more often than inside. A cloud-hosted solution means that devices can be managed wherever they are operated without relying on the users connecting to the mothership via VPN.

Organisations have typically been running multiple tools to address these device communities, but this adds complexity to what is already a complex environment. The goal of UEM is to create one tool to rule all the device communities.

The question that needs to be answered is:

Has Unified Endpoint Management reached its Digital Photography moment yet?

This may seem like an obscure question, but reading this blog by Steven Sinofsky caused me to take stock of my mindset regarding UEM. He used the example of the transition from silver halide film to digital photography. In the blog he described the technical buzz saw that devotees of the incumbent technology use to dismantle the challenger technology based on very specific and clearly defined limitations. He argued that over time, the challenger technology closed that gap. In addition, whole new workflows were invented that changed the face of photography.

I have been guilty of wielding that technical buzz saw regarding the mainstream UEM toolsets, targeting what I perceive are their shortcomings:

  • Inability to deliver a bare-metal build
  • Deployment of Win32 applications
  • Transfer of work from deployment engineers to non-IT staff

However, having read Steven’s post I revisited my thinking, looking at things from the other side of the argument.

Inability to Deliver a Bare-metal Build

UEM tools recognise that every device is shipped from its vendor with a perfectly good operating system including drivers for the subcomponents. We do not need to deploy one before we use the device, we simply need to configure the current one to meet our needs. This is the thinking behind Microsoft’s Autopilot process.

You may be thinking that Windows devices often ship with trial software as part of the factory-installed image, and that you do not want that adding to support complexity. For this reason, Dell Configuration Services recommends our “Generic Image” option, free of trial software, in conjunction with Microsoft Autopilot registration. This provides control over the version of Windows 10 installed and ensures a known, clean base from which to begin UEM control.

Those with one hand still on the buzz saw will point out that most vendor support processes will replace a failed hard drive with a new “clean” drive without an Operating System. However, as Sinofsky says, “Most problems are solved by not doing it the old way”.

Three mitigations come to mind:

  1. The move to thinner, lighter devices has driven the proliferation of solid-state storage, which is less likely to fail.
  2. Organisations can change their internal support processes to include a pool of devices to swap with any failed devices, thereby maintaining user productivity. The failed device is repaired and returned to the swap pool.
  3. Once critical mass is achieved, vendor support processes may move from a repair policy to a swap-out policy.

This inability to deliver a bare-metal build is unlikely to be resolved by the software itself, so it is one area where a mindset change may be the best route.

Deployment of Win32 Applications

This highlights how the march of technical development erodes the arguments presented by the devotees of the incumbent technology. The mainstream tools at the heart of UEM are typically mobile device management tools which were designed to deliver applications to mobile operating systems (Android and iOS).

The design specification would therefore have provided for delivering relatively small applications (a few hundred megabytes) which are simple in nature and without the need for dependency checking. Delivering Win32 applications to Windows 10 devices requires a more sophisticated capability. This capability is evolving, with the two vendors that Gartner sees as the leaders (Microsoft and VMware) in a race to bring this capability to market.


VMware was first to market with the ability to deliver Win32 apps. Their capability can deploy MSI, EXE and ZIP files and differentiates between applications and their dependencies. Additionally, VMware has released its AirLift connector, which connects Configuration Manager (ConfigMgr) to your Workspace ONE tenant and enables you to export applications from ConfigMgr and import them into Workspace ONE without repackaging.

This approach makes it easy to transfer the content and assignment metadata into Workspace ONE and will help customers who wish to move away from ConfigMgr in the long term. Based on my customer experience, ConfigMgr is the most widely deployed toolset, however, we are increasingly seeing customers with Ivanti LANDesk and IBM BigFix who would like to have a similar capability to help them move. It is to be hoped that the Workspace ONE engineering team can create a similar capability to assist them.

There is an additional benefit to following the AirLift-enabled route. Once the applications have been moved into Workspace ONE, Dell Configuration Services offers Factory Provisioning services. I have described this in more detail in a previous post entitled Windows 10 Migration: Best Practices for Making a User’s First Impression Great. In summary, this enables our customers to provide us with bundles of applications, including Win32 apps, for loading in our factory, thereby streamlining the deployment of their new devices using Workspace ONE.


Microsoft announced at Ignite in September 2018 that their capability would shortly be made available as a public preview. At the time of writing, this facility is still in public preview and being rolled out to Intune tenants. Applications need to be converted from their current format to the new .intunewin format. The process is enabled by a content-preparation tool but seems to involve significant manual data entry.
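The conversion itself is driven from the command line by Microsoft's Win32 Content Prep Tool (IntuneWinAppUtil.exe), which takes the setup folder, the setup file and an output folder as arguments. As a rough sketch of how a batch of conversions might be scripted — the wrapper function and example paths below are my own invention, not part of any Microsoft tooling, though the `-c`, `-s`, `-o` and `-q` flags are the tool's documented ones:

```python
def build_intunewin_command(setup_folder: str, setup_file: str,
                            output_folder: str, quiet: bool = True) -> list[str]:
    """Assemble the IntuneWinAppUtil.exe command line for one application.

    -c  folder containing the setup files
    -s  the setup file itself (e.g. an .msi or .exe)
    -o  folder that will receive the .intunewin package
    -q  quiet mode, for unattended scripted runs
    """
    cmd = ["IntuneWinAppUtil.exe",
           "-c", setup_folder,
           "-s", setup_file,
           "-o", output_folder]
    if quiet:
        cmd.append("-q")  # suppress the tool's interactive prompts
    return cmd

# Illustrative paths only: package one MSI into C:\intunewin
cmd = build_intunewin_command(r"C:\packages\app", "setup.msi", r"C:\intunewin")
print(" ".join(cmd))
```

Looping a function like this over a folder of packages removes some of the repetitive data entry, although the application metadata still has to be keyed into the Intune portal.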

Microsoft may well feel that they have this area covered by ConfigMgr which has been the mainstay of application deployment for many customers for years. Indeed, part of their strategy is to use Intune to automatically deploy the ConfigMgr client. This gets around the limitation that a user with standard permissions would not be able to deploy the ConfigMgr client themselves.

This approach then means that the device is now in a state of dual or co-management where control is achieved using two tools. The working premise is that these tools work in concert and provide a low risk approach to transitioning from ConfigMgr to Intune one workload at a time.

Over time, applications are moving from locally deployed to software-as-a-service or web-based delivery. As this happens, reliance on Win32 applications falls and this problem diminishes.

Transfer of Work from Deployment Engineers to Non-IT Staff

This is the key challenge for me when trying to adopt the change mindset. For years we have delivered fully built systems to our users to minimise their downtime. In part, I suspect this was because we were catering for users less technically capable than those we have today. It was also because, to meet security guidelines, users were given low-privilege accounts, which prevented them from completing these activities even if they were willing to do them.

The introduction of solutions such as Microsoft Autopilot means that users no longer need high-privilege accounts for key tasks. However, devices are delivered to them with few, if any, applications included. As described in the Windows 10 Migration post, deploying applications to the device can take a while. In the past this was done on the build bench and so was hidden from the users; it happened in what I call Engineer Time.

If application deployment is done after device receipt by the user, it is now in User Time. This has two implications:

  • The device is not ready for use immediately, potentially preventing a user from working
  • Simply transferring the work from the IT team to the users does not make it more cost effective

Let’s break each of those items apart and examine them.

Device Not Ready for Use Immediately

New Devices

Most devices deployed using an Autopilot method today will be replacement devices, where an existing user is getting a new device. Traditionally, they have been asked to hand over the old device on receipt of the new one. The application pre-provisioning methods described above may be sufficient to ensure that the device is fully ready at handover. If further time is required, briefly delaying the return of the old device allows the user to keep working on it whilst the new device finalises. This effectively negates the impact of the delay: the user can log in to the new device and allow processes to complete before relinquishing the legacy one.

Dell is investing heavily in technology and processes that will enable it to move more and more of this pre-provisioning work upstream into our factories. We are engaged with both Microsoft and VMware to look at ways to improve the day-one experience of your colleagues by automating as many of the tasks involved in deployment as possible.

Existing Devices

Where an existing device is being upgraded to Windows 10 from an earlier operating system, there are two approaches that will be used: In-place upgrade or wipe and load. An in-place upgrade simply updates the operating system and migrates the data and applications as is. There is no impact here.

Wipe-and-load upgrades require a bare-metal build process and therefore a toolset such as ConfigMgr. It is now possible to create a task sequence that performs the wipe and reload and then sets the device to use Autopilot when it is handed back to the user; but if device readiness at handover is a concern, this would not be the route to take. If performing a bare-metal build, it is more likely that the device will be handed back by an engineer fully ready for the user.

Transferring the Work from IT to the Users

New technology often results in the adoption of new or changed working practices. Before computers became standard issue, firms employed banks of typists to turn a manager’s thoughts or dictated words into formal output. No doubt somebody pointed out that asking a manager to type their own documents was less cost effective than asking a typist to do so. However, time moved on and the user empowerment that came from avoiding the need to dictate the content, saw the widespread adoption of the new way of working.

We are on the cusp of a similar change in end-user device deployment. My conversations with IT departments are increasingly focused on user empowerment rather than the IT team owning the tasks. Clearly, there are employees within the organisation who earn significantly more than the deployment engineers, but would they rather simply get the task done than organise a time to meet with an engineer?

There is no definitive answer here: some will want the job done for them, whilst others just want to get the job done, even if it means doing it themselves. In a way, that sums up where we are with unified endpoint management as well.

Dell’s viewpoint is that the best experience comes not from moving work from deployment engineers to users, but from automating tasks so thoroughly that the need for human intervention is removed entirely. The analogy we often use is withdrawing cash from your bank: you can visit the human teller for the full in-person service, or use the automated teller machine (ATM), which for most of us is more convenient and a better experience.


For some organisations, typically those with a highly mobile workforce, the scales are already tipped in favour of one of the UEM approaches. Those for whom the pace of workforce transformation is a little slower may still be happier with the traditional methods for now, but over time they too will find themselves drawn to UEM.

The point is that there is an opportunity to try something new and see whether it has reached the tipping point for you. Are we at the point where the opportunities offered by the new tools and processes outweigh the things they cannot yet do?

The UEM tools available today are not the whole story. They need to be combined with pre-provisioning and factory services to ensure that work is not simply transferred from one team to another but replaced by automation. This is where the Dell Technologies value comes in. Working with both Microsoft and VMware we are pioneering ways to automate the provisioning processes and drive the most value out of the shift to UEM.

As the focus shifts towards user experience in the ongoing battle to retain key staff, it is likely that organisations will look to deliver more user empowerment through a better understanding of their user environment. Dell EMC has developed a series of tools and techniques described in the free eBook, 4 Tools and Techniques to Create Change and Empower the Workforce with Personalized Experiences, to help you meet the needs of an ever more demanding workforce. Key to this approach is the development of user personas and a detailed knowledge of the user profile. All of this data feeds the UEM tools to make for a better initial experience.

If you are ready to ditch the silver halide film and join the digital workforce transformation, please feel free to contact your Dell EMC Sales Representative to discuss how we can help you.

Windows 10 Migration: It’s All About the Apps
Wed, 07 Nov 2018 10:00:05 +0000

In previous posts, I have looked at the use of traditional and modern management solutions to provision devices and how these techniques are suited to different categories of users (persona groups) within your workforce. I then discussed how important it is to make the best first impression and how that can be achieved by optimising data and applications migration. In this post, I will be addressing one of the hottest Windows 10 topics my customers ask for help with, namely preparing the applications for migration.

Preparing applications for migration is a multi-faceted challenge and with the introduction of Windows as a Service, it needs to be revisited for each Windows 10 version upgrade. The stages required to complete one pass of this process are as follows:

  1. Identify
  2. Rationalise
  3. Assess
  4. Package/Remediate
  5. Pre-provision/Deploy
  6. Support

Whilst addressing each of these steps on a twice-yearly or annual basis to keep pace with Windows 10 releases may seem daunting, putting the effort in on the initial migrations, and keeping the data up to date, means that subsequent updates are much less onerous. Adopting this rigour will yield a controlled and managed environment, with valuable information and security benefits.

1. Identify

This is a two-fold task. The first is to create an inventory of all the applications currently running in your estate. Note that I did not qualify them as authorised applications yet. Many customers have suffered with application sprawl either because they have multiple versions of the same application running or because too many people have elevated privileges and have been self-installing “useful” software.

The second part of the task is to build a matrix that shows which individuals use which applications. Once you have rationalised the list, you will need to update the matrix to detail the applications for each user, including the substitutions resulting from standardisation. This matrix, once completed, will be your guide when you come to pre-provision or deploy applications in step five.
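In practice the matrix is just a mapping from each user to the set of applications they need, with the rationalised substitutions applied. A minimal sketch — the inventory records and substitution table here are invented purely for illustration:

```python
from collections import defaultdict

# Raw inventory: (user, application) pairs as an inventory tool might report them.
inventory = [
    ("alice", "Acrobat Reader 11"),
    ("alice", "Office 2016"),
    ("bob",   "Foxit Reader"),
    ("bob",   "Office 2016"),
]

# Substitutions agreed during rationalisation: several PDF readers -> one standard.
substitutions = {
    "Acrobat Reader 11": "Edge (built-in PDF)",
    "Foxit Reader": "Edge (built-in PDF)",
}

def build_user_app_matrix(inventory, substitutions):
    """Map each user to the de-duplicated set of rationalised applications."""
    matrix = defaultdict(set)
    for user, app in inventory:
        matrix[user].add(substitutions.get(app, app))
    return dict(matrix)

matrix = build_user_app_matrix(inventory, substitutions)
```

However it is held — spreadsheet, CMDB or script — the useful property is the same: one authoritative answer to "which applications does this user get?" when pre-provisioning begins in step five.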

Customers address the identification challenge in several ways. Some have already implemented inventory and asset management software, such as Snow Software Inventory, to monitor their environments and assist with license compliance. Others will turn to Liquidware Stratusphere UX or Lakeside SysTrack as part of their desktop transformation process. Both solutions require investment in terms of licensing and implementation costs.

Microsoft has recognised the need for this information too. They are keen to gather as much telemetry data about the global Windows 10 estate as is possible. They also understand that enterprises are only likely to connect their systems and allow this data flow if they receive something compelling in return.

Microsoft has therefore created the Desktop Analytics portal. Initially released as the Upgrade Readiness toolkit, it has gained features and capabilities that justify the rebranding. There are no license fees associated with the tool although, as with everything, there are implementation costs. These derive from the need to enable telemetry on Windows 7 SP1 or later devices, as well as making firewall rule changes to allow that telemetry to flow to the portal.

2. Rationalise

Whichever route you choose to identify your applications, you will be presented with a list that will include more lines than you hoped for! As every step from hereon in costs money calculated on a per application basis, the more effort you expend rationalising the applications, the tighter your cost control will be. Opportunities for rationalisation are as follows:

  1. Standardising on one application to do a task. For example, do you really need three different dedicated PDF readers when Chrome or Edge will do? Even “free” applications cost money to support.
  2. It is not unusual to find 5-10 different versions of the same application where the patching and management solution has not been keeping pace. Standardising on one version will reduce the size of your list to a more manageable one and improve your security posture.
  3. Require a business justification for every application in the list – seize this opportunity to reduce cost and security risk.
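The second opportunity — collapsing many versions of the same application down to one — is mechanical enough to script against the inventory. A sketch, assuming version strings are simple dotted integers (real-world version strings are often messier and need a proper parser):

```python
def rationalise_versions(apps):
    """Given (name, version) pairs, keep only the highest version of each app.

    Versions are compared as tuples of integers ("10.2.1" -> (10, 2, 1)).
    """
    def key(version):
        return tuple(int(part) for part in version.split("."))

    latest = {}
    for name, version in apps:
        if name not in latest or key(version) > key(latest[name]):
            latest[name] = version
    return latest

# Invented inventory extract: three versions of one app collapse to one line.
discovered = [("7-Zip", "19.0"), ("7-Zip", "16.4"),
              ("VLC", "3.0.8"), ("7-Zip", "18.5")]
print(rationalise_versions(discovered))  # → {'7-Zip': '19.0', 'VLC': '3.0.8'}
```

Each line removed here is one fewer application to assess, package and support in the steps that follow.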

All this guidance must be balanced against demonstrable business need. Your organisation may need tools that other entities do not, and some applications may only work when paired with certain versions of supporting add-ins, but in general the advice holds. Furthermore, your information security team may have views on the use of legacy software or on some of the applications that are harder to justify.

Talking of the security team, now would be a good time for them to re-evaluate whether the security features built into Windows 10 will meet their needs instead of third party applications. This will of course depend on whether you are implementing the Enterprise SKU or not. The business cases for continued reliance on third party disk encryption and anti-malware provision, for example, are often reassessed at this stage.

3. Assess

Microsoft cites very high levels of application compatibility (more than 90%), but the caveat is that this applies to commercially available software. The compatibility of those in-house applications that are business critical will, of course, depend on how well your development team stuck to the rules.

So what options exist to quantify the risk?

Microsoft’s Desktop Analytics will help you with the commercial off-the-shelf (COTS) software. It can inform you that version X of the application is already running elsewhere by someone else who is providing the telemetry data. The more people that use the telemetry-based system, the more informative it gets – we are all, after all, in this together. The telemetry can also tell you about Office add-ins and their compatibility with Office 365 Pro Plus as well.

The challenge with telemetry-based systems is that someone else must be running your software to give you a reading. However, in-house software tends to be just that and therefore there is no one else to rely on.

So what alternatives do you have?

The first option is to manually test the software on a build of Windows 10. This is labour intensive and will only test the components of the product that the test script exercises. With a small number of applications to test, this may be acceptable. However, most people reading this blog probably have more than one or two applications to test, so what then?

Option two is to use an automated testing solution such as Citrix AppDNA or Flexera Test Center. Extending this, many organisations, including Dell EMC, offer services based on these tools that enable you to outsource the testing process and any subsequent remediation and repackaging tasks. These tools and services do not address underlying code issues, they simply use application packaging processes to address any incompatibilities.

For example, many customers will be standardising on 64-bit Windows 10 systems as part of their upgrade, if they had not made this change earlier. 64-bit Windows cannot run the 16-bit code that was often present in 32-bit applications, meaning that the application will fail. The test products and services mentioned above can detect this incompatibility, but no amount of re-packaging can fix this, it requires code changes.

Typically, output from the testing tool is in red-amber-green format. Red indicates a failure that may well need a new version. Amber indicates that there is a way to remediate the package, and green suggests that the package is ready for user acceptance testing. The only deviation is testing for 64-bit compatibility, which is either red or green – there is no halfway here.
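The triage rule can be captured in a few lines. This is a toy sketch of the logic only — the finding labels are invented, and real assessment tools apply far richer rule sets — but it mirrors the point above that 16-bit code cannot be repaired by repackaging:

```python
def rag_status(findings):
    """Classify one application's assessment findings as red, amber or green.

    Toy rule set mirroring the text: 16-bit code cannot be fixed by
    repackaging (red); any other known issue is assumed remediable via
    packaging (amber); no findings means ready for UAT (green).
    """
    if "16-bit-code" in findings:
        return "red"    # requires code changes or a new version
    if findings:
        return "amber"  # remediable during the packaging process
    return "green"      # ready for user acceptance testing

# Invented example applications and findings
for app, findings in [("LegacyLedger", {"16-bit-code"}),
                      ("ReportTool", {"hard-coded-path"}),
                      ("ModernApp", set())]:
    print(app, rag_status(findings))
```

Feeding the tool output through a rule like this gives you a first-cut work queue: red items go back to the developers or vendor, amber items to the packaging team, and green items straight to user acceptance testing.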

4. Package/Remediate

Having rationalised your application estate, you may well have decided that you are going to continue to use “ApplicationX” but you have previously been unable to upgrade to its latest version as it was only compatible with Windows 10, or perhaps you have made the switch to the 64-bit version of it. To deploy it widely, you will need to package it so that your Electronic Software Distribution (ESD) tool can deliver it.

This has several benefits:

  1. It will always be installed in the same way by the tool.
  2. It does not require an engineer to touch each device.
  3. The tool can be used to update the application, reducing the likelihood of application version sprawl and the inherent security vulnerabilities that brings.
  4. The users can be allowed to initiate the install from an application store without raising a helpdesk ticket, subject to licensing constraints.
  5. The tool can be defined as a Trusted Installer when using Windows Defender Application Control enabling you to further improve your security posture.

Applications are typically packaged as MSIs for tools such as Microsoft’s ConfigMgr, although some will be sequenced for use with application virtualisation solutions such as Microsoft’s App-V or VMware’s ThinApp.

There is a new kid on the block: MSIX. This is Microsoft’s attempt to make application installation as simple on a Windows device as it is on a mobile device. Support has been added to ConfigMgr Current Branch from build 1806 onwards. This format looks to have some good attributes and may well be important in future.

As suggested in the Assess section above, remediation of amber applications takes the form of modifying the existing package, perhaps to change a setting, or to add a helper file to make the application compatible with Windows 10.

Engaging with Dell EMC’s Global Applications Packaging team delivers a service that can assess and remediate existing packages as well as create new ones (MSI, App-V and ThinApp today, with MSIX coming soon), all on a per-application basis. Our packaging service works to a greater than 95% first-time-right rate, and we offer a warranty to address any issues that our team might introduce.

Finally, Microsoft has recently announced its Desktop App Assure programme via FastTrack, offering to help customers struggling with application compatibility, including those tricky line-of-business applications where you no longer have the source code. They have even made the shortcut easy to remember:

5. Pre-provision/Deploy

I have already discussed the benefits of using an ESD tool such as ConfigMgr to deploy your applications and in my previous post, Windows 10 Migration: Best Practices for Making a User’s First Impression Great, I discuss the benefits of pre-provisioning and how Dell EMC can help you with our factory configuration services using either ConfigMgr or Workspace ONE.

To recap, application deployment via a tool such as ConfigMgr or Workspace ONE gives you the ability to centrally control and manage your application estate, minimising application sprawl, whilst keeping things up to date to enhance your security posture. It also enables your users to install applications using an application store in similar fashion to the way they do on their mobile device.

Pre-provisioning is the deployment of the applications to the device prior to it being issued to the user. This can save bandwidth and will certainly enhance the day one user experience. However, it relies on the user application matrix discussed in the Identify section above. This then needs to be captured into the tooling such that it knows which applications are used by whom. Finally, it is necessary to assign the device to its new owner at build time, which may not suit every organisation.

6. Support

Support is critical. Software applications, presented as packages, represent an asset or configuration item. They will inevitably need patching and updating, as business needs change, or as the vendor fixes an issue or releases new functionality. With the change to the way Windows 10 is released (Windows as a Service), this cycle will be ongoing.

Windows as a Service

Windows as a Service is Microsoft’s term for the twice-yearly release of Windows 10 builds, typically released in spring and autumn. It was clear that new features would be added with each new release, but less well understood was the potential for features to be removed.

This removal is usually triggered by the lack of interest in the capability – an example recently (1809) was the removal of Business Scanning (AKA Distributed Scan Management) as there was no hardware vendor support for it. Alternatively, a feature can be superseded such as the Hologram app which has been replaced by the Mixed Reality Viewer.

This addition and removal of features can impact application compatibility, meaning that the roll-out of each build of Windows 10 should be considered a mini-migration. It is not as daunting as the initial move from Windows 7 to Windows 10 but still worth checking those business-critical applications.

As mentioned at the start, doing all of this on a twice-yearly or annual basis might seem a challenge. However, doing the hard work the first time you enter the Identify and Rationalise phases, and then keeping the information up to date, will allow you to become more efficient with each passing cycle.

The Dell EMC Services team can also undertake to do this lifecycle management for you. We can address both Windows 10 image changes and applications (compatibility and vulnerability) testing and remediation for you, every six months, to help you keep your environment up to date.

To find out more about this or any of the other services discussed in this post, please feel free to contact me.

Windows 10 Migration: Best Practices for Making a User’s First Impression Great
Mon, 08 Oct 2018 09:00:55 +0000

First impressions count!

This is the third in a series of blog posts looking at the enhancements and issues that our customers will experience when migrating data and applications to Windows 10, a process which is truly a transformation: a move to modern IT management. I’ll provide best practices to overcome the respective challenges and also suggest a means of making a user’s first impression of Windows great.

The Very First Impression on the Windows 10 User: Shorter Boot up Time with SSD

Little things count with first impressions, and there are good ones with Windows 10, beginning the moment the user turns on his or her device. Boot time is remarkably shorter thanks to the dramatic improvements Microsoft has made to ensure Windows 10 runs fast on a solid-state drive (SSD). Solid-state drives are also much more reliable, meaning less downtime due to failures.

The next set of favourable impressions relies on ensuring all the user’s data and applications have been migrated to the new device. This sounds simple, but in practice applications are proving the more difficult of the two, although the challenges have much in common.

Migrating Data to Windows 10: EFSS and OD4B

Most organisations are moving to an Enterprise File Sync and Share (EFSS) solution such as OneDrive for Business (OD4B) and looking to use it as part of the migration process. In theory, EFSS makes migration easy for the user and less work for the migration team: the user logs in on their new device, configures the sync client and the data starts to replicate. The difficulty arises from the volume of data that we each store today, which defines the time needed to complete the synchronisation.

All organisations have pockets of low bandwidth, and even in well-connected offices, volume rollouts can put pressure on the links as the number of users synchronising simultaneously reduces the bandwidth available to each.
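To make that bandwidth pressure concrete, here is a back-of-envelope sketch. All figures (data volume, link speed, user counts, and the naive fair-share split) are illustrative assumptions, not measurements or Dell guidance:

```python
def sync_hours(data_gb: float, link_mbps: float, concurrent_users: int) -> float:
    """Hours to replicate one user's data when an office link is fair-shared."""
    per_user_mbps = link_mbps / concurrent_users      # naive fair-share split
    seconds = (data_gb * 8 * 1000) / per_user_mbps    # GB -> megabits
    return seconds / 3600

# 20 GB per user over a 100 Mbps office link, 25 users syncing at once:
print(round(sync_hours(20, 100, 25), 1))  # 11.1 (hours)
```

Even with generous bandwidth, simultaneous volume rollouts stretch synchronisation into working-day timescales, which is why staggering waves of users matters.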

OD4B addresses the low bandwidth issue by allowing users to partially synchronise their data with the local machine but they need to be educated to use this capability carefully.

Synchronised files are available offline but those that are yet to be synchronised can only be accessed when the user is online. Partial synchronisation offers the benefits of a reduced time to complete and less space used on the device but forces the user to connect to get a cloud-only file.

Migrating Applications to Windows 10: Configuration Manager

Applications are a tougher challenge to migrate, and how you address this will depend on whether you have chosen to Shift Left or Shift Right. Broadly speaking, the challenges are the same, but the toolsets are different. Most of our customers are still using a tool such as Configuration Manager (ConfigMgr) to build their devices. I will talk about the differences seen with UEM toolsets such as Intune and Workspace One in the following section.

In most current systems, the device is built by a Task Sequence and brought under ConfigMgr management. So far, so good, but how do we ensure we have all the user’s applications on the device ready for them?

To address this question, we first need to know:

  1. Which applications are in our estate today?
  2. Which of those applications are authorised to be in our estate?
  3. Of the authorised applications, which versions are Windows 10 compatible?
  4. Do we have a package containing our Windows 10 compatible version of each authorised application ready for distribution?

Answering these questions affirmatively means we now have a library of applications ready for our users, but one question remains, and it is often the biggest challenge for customers: which application is used by whom?
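The four readiness questions above amount to simple set arithmetic over your application inventory. The sketch below uses hypothetical application names to show how each gap falls out; a real estate would have thousands of titles drawn from discovery tooling:

```python
# Hypothetical application names, one set per readiness question.
discovered  = {"Office 2016", "AppA 1.0", "AppB 2.3", "ShadowIT Tool"}  # Q1: in the estate
authorised  = {"Office 2016", "AppA 1.0", "AppB 2.3"}                   # Q2: allowed
win10_ready = {"Office 2016", "AppB 2.3"}                               # Q3: compatible version exists
packaged    = {"Office 2016"}                                           # Q4: package ready

to_retire  = sorted(discovered - authorised)    # unauthorised titles to remove
to_test    = sorted(authorised - win10_ready)   # need compatibility work
to_package = sorted(win10_ready - packaged)     # need a Windows 10 package
print(to_retire, to_test, to_package)  # ['ShadowIT Tool'] ['AppA 1.0'] ['AppB 2.3']
```

Each resulting list is a work queue: retire, remediate, or package, before the rollout begins.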

Applications Installed Manually by a Technician versus Automatically by ConfigMgr

Whether applications are installed manually by a technician or by an automated process using ConfigMgr collections, it is only possible to know that the job is done if you truly understand what the task called for. Furthermore, the list of required applications needs to be accurate and provided ahead of the deployment, rather than gathered by asking the user two days before the anticipated device handover, because it must be cross-checked against the answer to Question 4 above.

If we have this list, the best way to make this work is to code the detail into ConfigMgr and use it to deploy applications to a specific device.

Whilst it is possible to target application deployment to the user rather than the device, the applications would then only deploy once the user logs on for the first time, which is exactly the type of experience we are hoping to avoid.

Targeting the device, however, means we need to make the device user-specific from the moment it is first built. This is workable when deploying in small quantities, but for volume deployments the additional time a deployment engineer or tech bar member spends searching for the specific device for each user will likely rival the time the user would spend deploying their own applications.

As a result, many organisations will choose to go to line of business or departmental application level rather than full user specificity to find the best balance between cost and benefit. User specific applications can be self-installed using the Software Center component in ConfigMgr. This approach means that the user can get started immediately, whilst their specific applications are installed in the background.

Dell’s Connected Configuration Service: Making a User’s First Impression of Windows 10 Great

Dell offers our Connected Configuration Service to enable customers to extend their ConfigMgr environments via a VPN into our logistics chain. This means that devices can be built using your task sequence, joined to your domain and have applications deployed, just as they would be in an onsite build facility. Once prepared, the devices are re-boxed and delivered to their new owner by our logistics team.

Figure 1: Dell’s Connected Configuration Service

The Shift Right Approach: Impact of Unified Endpoint Management

For those that have chosen the Shift Right approach, the applications set will now be delivered via their Mobile Device Management (MDM) toolset of which the two main players are Intune and Workspace One (formerly AirWatch). The industry is moving to the terminology of Unified Endpoint Management (UEM) to denote that the toolsets have matured to allow both mobile (smartphone) and PC management by the same toolset.

Regardless of the chosen tool, applications will be targeted to the device once the device enrolment process has completed and the device has been assigned a profile in the tool. In this case, the equation used for data applies equally to applications: the time to complete is roughly the total size of the targeted applications divided by the available bandwidth.

As many desktop applications are of significant size and some users need many applications, the time to complete can often be measured in hours. This is where aligning the application distribution approach and user persona becomes important. In How to Modernize Your PC Management Approach, I argued that UEM toolsets were best suited to users with the lightest on-device application requirement; for users who rely most heavily on Software as a Service (SaaS) or web applications, this is less of an issue.

Inevitably users will still require additional locally installed applications, and we would prefer to preinstall those before the user gets the device to ensure the best first impression. If the tool can only distribute applications after the enrolment process has completed, but we need to deploy applications to a device before the enrolment process starts, how can we break this logjam?

The answer lies in Dell’s ability to preconfigure devices before we ship them.

Dell’s Dynamic Imaging: Making a User’s First Impression of Windows 10 Even Greater

Dell offers our customers the ability to ship devices with a customer specific build preloaded onto them; a process we call Dynamic Imaging. Dynamic Imaging applies an image to the disk and injects into that image the driver pack for that device. This process enables us to support customers who want to maintain a single image for multiple hardware variants.

Using Dynamic Imaging, customers can include common applications that apply to all users, for example security tooling, Office and PDF reader applications. In the past, customers made this image very application rich to minimise the impact of installing user applications on their network. However, the image became bloated and difficult to manage. We therefore guide our customers to keep this image as lean as possible.

So how do we meet our target of preinstalling user applications?

Here at Dell, we regularly talk about the Dell Technologies Advantage, which is where different brands within the family come together and the result of that collaboration is a real customer benefit. In this case, our Configuration Services team have worked with the Workspace One part of VMware to bring forward a solution to the application pre-provisioning problem.

Applications, or groups of applications, that are prepared for delivery via Workspace One can be exported from the tool to a PPKG (provisioning package) file. The tool also provides an interface to build an Unattend.xml file to allow automated on-premises (AD) domain join and enrolment with Workspace One. The PPKG and the Unattend.xml file are then transferred to Dell via a secure FTP service.
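For readers unfamiliar with answer files, the Unattend.xml in this flow typically carries the domain-join details in the specialize pass. The fragment below is an illustrative sketch only: the Workspace One console generates the real file, and every value shown (domain, OU, service account) is a placeholder:

```xml
<!-- Illustrative placeholder values only; the real file is generated
     by the Workspace One console, not written by hand. -->
<unattend xmlns="urn:schemas-microsoft-com:unattend">
  <settings pass="specialize">
    <component name="Microsoft-Windows-UnattendedJoin"
               processorArchitecture="amd64"
               publicKeyToken="31bf3856ad364e35"
               language="neutral" versionScope="nonSxS">
      <Identification>
        <JoinDomain>corp.example.com</JoinDomain>
        <MachineObjectOU>OU=Workstations,DC=corp,DC=example,DC=com</MachineObjectOU>
        <Credentials>
          <Domain>corp.example.com</Domain>
          <Username>svc-domainjoin</Username>
          <Password>(stored securely)</Password>
        </Credentials>
      </Identification>
    </component>
  </settings>
</unattend>
```

Because the file contains join credentials, it is one reason the transfer to Dell happens over a secured channel.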

Figure 2: Dell Configurations Services (Workspace One)

A Dell Configuration Services technician then boots the device, applies the Windows 10 build, drivers and the PPKG file, and places the Unattend.xml file on the disk. The device is then placed back into its shipping carton and delivered to its new owner. On receipt, the user need only install any applications from the Workspace One application store that they use over and above those specified in the PPKG file.

For example, the PPKG file might be department specific, but they may require two applications that no one else in their department uses. These applications can be installed by the user. Importantly though, the user can do most of their work whilst those applications are provisioned.

Dell and Your Device Deployment

Dell has industry-leading Configuration Services which can give your users the best first impression when they receive their new Windows 10 device, whether that device be delivered using ConfigMgr or UEM tools.

When these Configuration Services are combined with VMware Workspace One, the Dell Technology Advantage provides the best solution in the marketplace today to support the needs of your ultramobile users.

Figure 3: The Winning Combination – Configuration Services + Workspace One + Dell Technology Advantage

If this post has helped you formulate your best route to Windows 10, or if you have more questions, I would love to hear from you.

You may be interested in these other blogs:

Windows 10 Migration: Should You Shift Left, or Right?
How to Modernize Your PC Management Approach


Windows 10 Migration: Should You Shift Left, or Right? Mon, 18 Jun 2018 09:00:03 +0000

The post Windows 10 Migration: Should You Shift Left, or Right? appeared first on InFocus Blog | Dell EMC Services.

The time has finally come when many organisations are taking on Windows 10 migration. The software was first released in July 2015 and has matured through six releases, bringing the stability that enterprises are looking for. Software vendors have had time to ensure that their applications are available in Windows 10 compatible versions. There is also the looming end of support for Windows 7, whether through the removal of hardware support or the need to migrate before the software support window closes.

Equally, user experience is now becoming a significant part of the conversation with any IT department. As developed economies achieve high levels of employment, there is more competition to attract the brightest and best talent. It is no longer just about the pay and benefits package, but now includes softer items like the type and mobility of devices and the inclusion of current applications.

Many organisations have built up significant technical debt and their devices are now over four years old. They are therefore looking at large-scale programmes to replace their end user compute estates and simultaneously migrate to Windows 10 on the new equipment.

The question becomes – how do they do this in the most appropriate way?

Shift Left

Traditional device replacement programmes have relied on a wave of upgrade activity propagating through the organisation. This usually took the form of buying new devices, shipping them to a staging location, perhaps provided by a third party, imaging them and delivering them to the end user. This is shown in the diagram below:

Figure 1: Typical cost breakdown of deploying a new device.

Many of the activities shown above can be more efficiently performed in the factory location using Dell’s Configuration Services capabilities. This approach removes the need for the staging location and a shipping leg, thereby reducing the carbon footprint of the organisation and contributing to their social responsibility goals.

Where staging centres have been used in the past, these have been connected to the enterprise network and the systems built there, rather than at the deskside. This usually involves extending automated OS deployment tooling into that location. Dell’s Connected Configuration solution, part of its ProDeploy Plus offer, allows enterprises to use our regional distribution hubs as their staging centres. The result is shown in the diagram below:

Figure 2: Dell’s Shift Left Optimisation

This approach saves time and money as the devices are imaged before they start their final delivery leg, arriving at the desk imaged and ready for the user to start work. Users only need to install their elective applications, those they use over and above the standard for their department. Providing that they have access to a self-service application store such as the Software Center included within SCCM, they can do this easily in a similar way to using their smartphone app store.

As this efficiency gain is driven by using the Dell factory and shifting activity left on the diagram, Dell refers to this as a Shift Left optimisation. We have the facilities to scale beyond those of most organisations in terms of daily throughput. Furthermore, our regional distribution hubs are situated such that all devices for a given territory transit through them anyway, so using these locations is much more efficient than using a third-party staging facility.

Data and settings migration may still require technical assistance, but these days more and more organisations are moving to Enterprise File Sync and Share (EFSS) solutions such as OneDrive for Business, meaning that disciplined users only need to sign in to their tool and their data will start to be replicated to the new device.

Dell’s Managed Deployment teams can provide support for these deskside activities, should it be required, as well as providing resources to hand out the new devices and collect the legacy ones, dealing with packaging removal as part of the task.

Shift Right

Microsoft has been promoting the idea that Windows 10 can be provisioned (note: not imaged), using the AutoPilot process much more easily and at less cost than the traditional imaging methods. The idea here is that the device is shipped with a standardised OEM operating system image on it and a hardware device ID is registered with Microsoft to enable the AutoPilot process.

In practice, this transfers much of the work shown in figure 1 to the end user and is shown below.

Figure 3: The workload balance for Shift Right.

This is often marketed as user enablement or empowerment, but what does that mean in practice?

The end user is now responsible for:

  1. Collecting their new device from a tech bar or similar distribution location.
  2. Perhaps unboxing the device and installing it to their desk.
  3. Going through a tailored Windows 10 out of box experience (OOBE).
  4. Logging into the device with their Azure Active Directory credentials thereby “joining the device to the domain” and drawing down policy settings and triggering mandatory software installs (as defined by their organisation) such as Office 365 Pro Plus.
  5. Installing additional applications from the Mobile Device Management Application Store or the Windows Store for Business (WSfB).
  6. Signing in to their EFSS solution (OneDrive for Business etc.).
  7. Waiting whilst their applications are downloaded and installed from the application store or WSfB. (This could take a while dependent on bandwidth and size of applications.)
  8. Waiting whilst their data synchronises, again bandwidth and volume dependent.
  9. Starting work with those applications and that data that is available to them whilst the remainder is delivered in the background.

Should We Shift Left, or Right?

In a previous blog post, Managing PCs the Modern Way, I considered the prime candidates for each of these deployment methodologies. To briefly reprise that post, Dell describes five key personas:

  1. Desk Centric – greater than 50% of time spent at a fixed desk location.
  2. Corridor Warrior – collaborative worker that divides their time between desk and meeting room locations but in one office or campus.
  3. On-the-Go Pro – highly mobile individual travelling between locations, cities and often countries. Often works on public Wi-Fi or mobile networks.
  4. Remote Worker – greater than 50% of time spent at home.
  5. Specialist Worker – in many cases this category is driven by the hardware they use and Dell has defined three subcategories of this persona. It could be a Field Worker who needs ruggedized equipment, an Engineer who needs hardware that enables them to be creative perhaps using CAD/CAM software or a Creative user with peripherals such as the Dell Canvas.

It is my belief that, from a management perspective and therefore a deployment methodology, Shift Right, enabled by modern management, best suits the On-the-Go Pro population and Specialist users with a similar level of mobility. Remote Workers who primarily use Software as a Service (SaaS) apps may also be candidates.

Those users who are classified as Desk Centric, Corridor Warrior and the proportion of Remote Workers who use customer datacentre-hosted applications are probably best handled using the Shift Left methodology. This view is based on the fit of the post-deployment management technologies to the ways these users work.
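This persona-to-methodology mapping can be expressed as a simple rule of thumb. The sketch below is an illustration of the post's reasoning, not a Dell tool; persona names match the list above, and the `saas_heavy` flag stands in for the Remote Worker distinction between SaaS apps and datacentre-hosted applications:

```python
# Illustrative default mapping of Dell personas to deployment methodology.
def suggest_methodology(persona: str, saas_heavy: bool = False) -> str:
    # Highly mobile users are the natural Shift Right candidates.
    if persona in {"On-the-Go Pro", "Specialist (mobile)"}:
        return "Shift Right"
    # Remote Workers split on whether their apps are SaaS or datacentre-hosted.
    if persona == "Remote Worker" and saas_heavy:
        return "Shift Right"
    return "Shift Left"

print(suggest_methodology("Corridor Warrior"))     # Shift Left
print(suggest_methodology("Remote Worker", True))  # Shift Right
```

In practice this is only a starting point; the non-technical considerations discussed below can move individual groups either way.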

Having said that, is it really that simple? When Shift Left includes an engineer to handle the deskside deployment, it is like a waiter bringing your food to the table and serving it to you. By that analogy, Shift Right is a self-service buffet.

Depending on the type of individual that fits into your On-the-Go Pro category, asking them to self-serve may or may not be appropriate. Some will see it as end user empowerment, whilst others will see it as the IT department cutting costs and offloading work onto users. Do you need to provide a “Tech Bar”-style walk-up help facility to address issues rather than overloading your Help Desk? How does this impact any outsource contract you might have? And how do you quantify the lost productivity of what could be some well-paid individuals whilst they wait for their applications to download?

These issues are soft or non-technical, but need to be addressed before simply opting for what appears on the face of it to be the “cheaper” option.


Dell EMC’s Support and Deployment Services can help you identify the persona groups within your organisation using a mix of tooling and business analysis. Once we understand that mix and your readiness to adopt the modern management approach, we can help you define the roadmap to enable you to transform to your desired future state. In this way, we can help you find the right balance between Shift Left and Shift Right within your organisation. We will then stand with you through your deployment, bringing the relevant services to bear as you need them. The diagram below shows how we see these mapping against the personas, with those suiting Shift Left on the left and those suiting Shift Right on the right.

Figure 4: How Dell can assist with both Shift Left and Shift Right.

If you have any questions pertaining to Windows 10 migration or the Shift Left or Right methodologies, please feel free to contact me via the comments below.

Part of the Blog Series:

Windows 10 Migration: Best Practices for Making a User’s First Impression Great

How to Modernize Your PC Management Approach


How to Modernize Your PC Management Approach Tue, 12 Dec 2017 10:00:58 +0000

The post How to Modernize Your PC Management Approach appeared first on InFocus Blog | Dell EMC Services.

I meet a lot of customers and love to hear their thoughts on the future of the end user computing world as much as they love to hear my digest of what’s going on in the marketplace. Discussion of late has revolved around an emerging trend – modernizing one’s approach to PC management – a subject I believe is partly driven by Windows 10 and the smartphone revolution.

We have all become accustomed to carrying miniature computers in our pockets that just happen to be able to make phone calls. We merrily add applications and upgrade their operating systems. Yet deploying the PC on our desk remains a logistical nightmare.

Why can’t our new computer be as logistically simple as getting a new phone?

Because the consumerisation of IT and the expectations of a more tech-savvy workforce are at play.

My colleagues ask me to help them understand this trend and how it’s changing the way we assist our customers. If a user can acquire a new smartphone, sign into a cloud authentication service and download applications, why is it so difficult to deploy PCs?

To understand this, we must first address the history and market forces in the PC industry.

Let’s first establish shorthand terms:

PC means any device running the desktop version of Windows 10 irrespective of form factor (laptop, all-in-one, desktop or tablet). Devices running Windows Phone, Android or iOS will be referred to as smartphones.


It’s long been the practice of original equipment manufacturer (OEM) vendors to bundle Windows with every PC sold. This is a practice Microsoft has encouraged, ostensibly to save the purchaser from installing the operating system (OS), which has proven very helpful in establishing Microsoft as the dominant OS provider. Over time the practice of pre-installing software has been extended to include vendor tools and trial versions of software from third parties.

Whilst this is tolerable for the consumer market, the variability of the third-party software installed is intolerable to the enterprise market. To counter this and reduce the total cost of ownership (TCO) of supporting their devices, IT departments have typically wiped devices clean and built them up from bare metal in a process referred to as imaging. This process often takes four to five hours per device, adding cost and delaying the point at which the user takes control of the device.

Vendors, resellers and distributors, Dell among them, endeavored to make the PC provisioning and deployment process more efficient with centralised systems and developed a variety of capabilities, spawning a whole new market segment.

Traditional Imaging Options

Dell leads the market in providing imaging and configuration services for our customers who hold Microsoft Volume Licenses. These services are:

  1. Static – simple imaging using technologies such as Ghost, usually for a single device type based on the driver library included within the image.
  2. Dynamic – a development on the static imaging which enables device flexibility by dynamically including the driver library for different Dell hardware families.
  3. Connected Configuration – the modern imaging solution which works by extending the customer’s System Center Configuration Manager (SCCM) environment into a Dell build facility so devices are imaged in the same way as they would be at a customer site. This method provides significant customer benefit.

Once imaged, the users are typically managed via Active Directory authentication and the devices by SCCM or similar tooling.

Modern Provisioning

With Windows 10, Microsoft has enabled an entirely different approach to the problem. In building Windows 10 with a common code set shared across the PC and smartphone versions, the OMA-DM (Open Mobile Alliance-Device Management) specification is built into the base OS. For the first time, a Windows PC can be managed using tooling that was originally designed for smartphone management. This integration of mobile and Windows 10 PC management is referred to as unified endpoint management (UEM).

It should be noted that the OMA-DM specification is for the management and configuration or provisioning of the device, not imaging, and there are key differences between the approaches. Imaging allows deployment of the base OS, whereas provisioning assumes the base OS is already on the device and seeks to control it.

To encourage customers to take up Windows 10 more quickly, Microsoft strongly advocates the use of provisioning, as it is a lighter touch and provides a lower barrier to entry for the organisation. Equally, mobile device management (MDM) toolset vendors have aligned themselves with this narrative as it increases their addressable market. The key thrust of the approach is that any Windows 10 device can be provisioned and managed, irrespective of vendor or acquisition route, including bring your own device (BYOD).

At the beginning of July, Microsoft launched Windows AutoPilot, which enables an end user to follow a simplified process to join a Windows 10 (1703) PC to Azure AD (Premium) and enrol it with the organisation’s chosen MDM tool. The Fall Creators Update for Windows 10 is expected to extend the same tooling to allow end users to join their Windows 10 PC to an on-premises AD. In both cases, OEM pre-registration of the devices with Microsoft and Azure AD Premium licensing are required.

Enterprises are keen to consider options that reduce their TCO and make it easier to roll out the new OS. As a result, they are looking to use the MDM tooling acquired to manage their smartphones to manage these devices too.

However, there are key differences between managing devices via the traditional route and those provisioned in this way.

Scenario 1:

  • Workforce is largely office or with a fixed base of operations using a mix of PC form factors, typically connecting directly to a corporate network or via a VPN solution.
  • Applications tend to be more complex, drawing on local system resources or client-server with limited usage of external web or Software as a Service applications.
  • Organisational information assurance policies require systems are tightly managed to ensure compliance with patching and update policies.
  • Web access is channeled via IT provided proxy solutions to manage bandwidth and police content.
  • Data is stored on network file servers and email is hosted internally.

This scenario is common within many organisations today and will be recognizable to most. It is typically based on Active Directory authentication and SCCM management. The devices are typically imaged.

Scenario 2:

  • Highly mobile workforce, typically using newer form factors.
  • Applications are either locally installed or accessed via a web browser (SaaS).
  • Email and office automation software are delivered via Office 365, Google Apps or similar toolset.
  • Data is stored in OneDrive for Business or Google Drive enterprise file sync and share solutions.
  • Applications are acquired via Windows Store for Business or the Mobile Application Management (MAM) capability of the MDM tooling.
  • There’s limited reliance on corporately provided applications. Identity management can require the user to manage numerous credential sets or require corporate investment in single sign-on (SSO) solutions.

This use case is becoming more common in sales environments and lends itself to the provisioning approach. It particularly appeals to customers who could go to any computer retailer and acquire a device that can then be provisioned, should their current device fail.

Equally, customers are looking at BYOD solutions to manage contingent labour. Here the customer feels that the contract day rate should include the contractor providing their own equipment which the corporate IT staff manage via the MDM toolset.

In reality, users within customer organizations do not split cleanly into Scenario 1 or 2 but are a blend of the two.

What Does the Future Hold?

Most customers fall predominantly into Scenario 1 for historical reasons. As sales forces become more mobile and their applications shift from in-house hosted to software-as-a-service, they will lean towards Scenario 2.

Figure 1: The Journey to Modern IT


However, there will still be a proportion of the workforce that does not require or cannot work in this Mobile First, Cloud First way, as they are tied to incompatible applications or the cost of transforming those applications is too high. Over time, though, the number of users in this group is expected to fall.

As organisations transform their business processes and IT support to a more mobile-friendly approach, the balance of power will shift from the traditionally imaged to the modern provisioned. The speed of this transition will be determined by organisations’ ability to invest in the transformation, whether by shifting to SaaS versions of their current applications or by adopting application publishing solutions to enable access to internal applications from devices outside the corporate firewall.

The balance is currently weighted firmly in favour of the traditional imaging approach, but based on the number of organisations actively investigating modern provisioning, this will not last for long.

How Dell Can Help Our Customers

Dell EMC is well placed to address both the current and future markets. We have a very strong presence in the device imaging (Scenario 1) business and are trusted by our customers to deliver this service through our ProDeploy for Client Suite. We are seeing strong interest in the ProDeploy Plus business as customers look to optimise the traditional element of their estate. There is a gradual shift of customers from static to dynamic to Connected Configuration.

To address Scenario 2, we can leverage our strong relationship with Microsoft and use our Services capability to deliver solutions based around both the SCCM and Enterprise Mobility + Security (EM+S) suite which addresses the modern provisioning approach. The integration between SCCM and Intune is improving with time but they are fundamentally different offers that work together.

As interest in the Windows AutoPilot tooling grows, we are seeing significant interest in our ability as an OEM to pre-register our devices with Microsoft to enable that approach for modern provisioning. With the release of Fall Creators Update there will be the opportunity to create a hybrid approach whereby the Windows 10 device is delivered to a user without being imaged and can be joined by the end user to the domain via AutoPilot.

When VMware joined Dell Technologies, it brought the Workspace One solution, which includes tooling to address the modern provisioning (Scenario 2) approach. The AirWatch component is a strong brand in the MDM market, and many customers investigating Microsoft’s EM+S evaluate both AirWatch and Intune. Equally, the application publishing capabilities, which rely on VMware’s Horizon product, are an alternative to the Azure RemoteApp capability that Microsoft has discontinued in favor of Citrix XenApp Essentials.

Microsoft’s Azure Active Directory Premium (AADP) builds upon the capabilities of the Azure AD license, which many customers will have as part of their Office 365 migration, enabling the AADP identity to be used to access many third-party SaaS apps without requiring the IT team to build and manage a web of bilateral authentication arrangements.

We can assist you to design, build and implement your Modern Management Capability to meet the evolving needs of your increasingly mobile user community. In doing so, we will address the security, functionality and affordability challenges specific to your business, enabling you to give users the flexibility they demand without relinquishing control of the environment.

Let me know in the comments below if you see this trend emerging in your industry and geography. I look forward to hearing from you about the way your organisation is addressing the consumerisation of IT.

Related Blogs:

Windows 10 Migration: Best Practices for Making a User’s First Impression Great
Windows 10 Migration: Should You Shift Left, or Right?

