Matt Liebowitz – InFocus Blog | Dell EMC Services

Best Practices for Virtualizing Active Directory Domain Controllers (AD DC), Part II
Mon, 15 Oct 2018

The post Best Practices for Virtualizing Active Directory Domain Controllers (AD DC), Part II appeared first on InFocus Blog | Dell EMC Services.

Virtualized Active Directory is ready for Primetime, Part II!

In the first of this two-part blog series, I discussed how virtualization-first is the new normal and fully supported; and elaborated on best practices for Active Directory availability, achieving integrity in virtual environments, and making AD confidential and tamper-proof.

In this second installment, I’ll discuss the element of time in Active Directory; touch on replication, latency, and convergence; and cover preventing and remediating lingering objects, cloning, and, of much relevance, preparedness for disaster recovery.

Proper Time with Virtualized Active Directory Domain Controllers (AD DC)

Time in virtual machines can easily drift if they are not receiving constant and consistent time cycles. Windows operating systems keep time based on interrupt timers set by CPU clock cycles. In a VMware ESXi host with multiple virtual machines, CPU cycles are not allocated to idle virtual machines.

To plan for an Active Directory implementation, you must carefully consider the most effective way of providing accurate time to domain controllers and understand the relationship between the time source used by clients, member servers, and domain controllers.

The Domain Controller with the PDC Emulator role for the forest root domain ultimately becomes the “master” timeserver for the forest – the root time server for synchronizing the clocks of all Windows computers in the forest. You can configure the PDC to use an external source to set its time. By modifying the defaults of this domain controller’s role to synchronize with an alternative external stratum 1 time source, you can ensure that all other DCs and workstations within the domain are accurate.
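To make that hierarchy concrete, here is a small Python sketch, illustrative only, of who syncs time from whom under the default configuration. The machine roles and the NTP hostname are hypothetical, not a real API:

```python
# Hypothetical model of the default AD time hierarchy: workstations and
# member servers sync from a DC in their domain, DCs sync from the forest
# root PDC emulator, and the PDC emulator syncs from an external source.

def time_source_for(machine):
    """Return the machine's time source under the default NT5DS hierarchy."""
    if machine["role"] == "forest-root-pdc":
        # Only the forest root PDC should point at an external stratum 1 source.
        return machine.get("external_ntp", "UNCONFIGURED")
    if machine["role"] == "dc":
        return "forest-root-pdc"
    return "any-dc-in-domain"   # workstations and member servers

forest = [
    {"name": "PDC01", "role": "forest-root-pdc", "external_ntp": "time.example.com"},
    {"name": "DC02",  "role": "dc"},
    {"name": "WS01",  "role": "workstation"},
]

for m in forest:
    print(m["name"], "->", time_source_for(m))
```

The key takeaway the sketch encodes: configure the external source in exactly one place (the forest root PDC), and let the built-in hierarchy handle everything downstream.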

Why Time Synchronization Is Important in Active Directory

Every domain-joined device is affected by time!

Ideally, all computer clocks in an AD DS domain are synchronized with the time of an authoritative computer. Many factors can affect time synchronization on a network. The following factors often affect the accuracy of synchronization in AD DS:

  • Network conditions
  • The accuracy of the computer’s hardware clock
  • The amount of CPU and network resources available to the Windows Time service

Prior to Windows Server 2016, the W32Time service was not designed to meet time-sensitive application needs. Improvements in Windows Server 2016 allow you to implement a solution with 1 ms accuracy in your domain.

Figure 1: How Time Synchronization Works in Virtualized Environments

See Microsoft’s How the Windows Time Service Works for more information.

How Synchronization Works in Virtualized Environments

An AD DS forest has a predetermined time synchronization hierarchy. The Windows Time service synchronizes time between computers within the hierarchy, with the most accurate reference clocks at the top. If more than one time source is configured on a computer, Windows Time uses NTP algorithms to select the best time source from the configured sources based on the computer’s ability to synchronize with that time source. The Windows Time service does not support network synchronization from broadcast or multicast peers.
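As a rough illustration of that selection step (a simplification, not the actual W32Time algorithm), a client preferring the lowest reachable stratum and breaking ties on root dispersion might look like this:

```python
# Illustrative sketch of best-time-source selection: prefer the lowest
# stratum among reachable sources, breaking ties on root dispersion
# (a rough measure of accumulated clock error). Source data is made up.

def pick_best_source(sources):
    reachable = [s for s in sources if s["reachable"]]
    # Tuple key: lower stratum wins first, then lower root dispersion.
    return min(reachable, key=lambda s: (s["stratum"], s["root_dispersion"]))

sources = [
    {"name": "ntp-a", "stratum": 2, "root_dispersion": 0.030, "reachable": True},
    {"name": "ntp-b", "stratum": 1, "root_dispersion": 0.120, "reachable": True},
    {"name": "ntp-c", "stratum": 1, "root_dispersion": 0.015, "reachable": False},
]
print(pick_best_source(sources)["name"])   # ntp-b: lowest reachable stratum
```

Note that ntp-c would win on the numbers but is excluded because it is unreachable, mirroring why Windows Time only considers sources it can actually synchronize with.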

Replication, Latency and Convergence

Eventually, changes must converge in a multi-master replication model…

The Active Directory database is replicated between domain controllers. The data replicated between controllers is partitioned into units called ‘naming contexts.’ Once a domain controller has been established, only the changes are replicated. Active Directory uses a multi-master model: changes can be made on any domain controller, and those changes are sent to all other domain controllers. The replication path in Active Directory forms a ring, which adds reliability to the replication.

Latency is the time required for all updates to be completed across all domain controllers in the domain or forest.

Convergence is the state at which all domain controllers have the same replica contents of the Active Directory database.
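A toy Python model can make latency and convergence tangible: a change originating on one DC in a replication ring spreads to the ring neighbours each replication cycle until every DC holds the same replica. The ring size and the one-hop-per-cycle model are illustrative, not tuned to real intra-site replication schedules:

```python
# Toy model of multi-master convergence on a replication ring: a change
# made on one DC propagates to both ring neighbours each cycle until
# every DC holds the same replica. Returns the cycle count (latency).

def cycles_to_converge(num_dcs, origin=0):
    have_change = {origin}
    cycles = 0
    while len(have_change) < num_dcs:
        # Each cycle, every DC that has the change replicates it to
        # both of its ring neighbours.
        new = set()
        for dc in have_change:
            new.add((dc - 1) % num_dcs)
            new.add((dc + 1) % num_dcs)
        have_change |= new
        cycles += 1
    return cycles

print(cycles_to_converge(8))   # 4 cycles for an 8-DC ring
```

The change spreads two hops per cycle (one in each direction), which is also why the KCC adds shortcut connections in large topologies: a pure ring converges slowly as DC count grows.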

Figure 2: How Active Directory Replication Works

For more information on replication, latency and convergence, see Microsoft’s Detecting and Avoiding Replication Latency.

Preventing and Remediating Lingering Objects

Don’t revert to snapshots or restore backups older than the TSL.

Lingering objects are objects in Active Directory that have been created, replicated, deleted, and then garbage collected on at least the Domain Controller that originated the deletion but still exist as live objects on one or more DCs in the same forest. Lingering object removal has traditionally required lengthy cleanup sessions using various tools, such as the Lingering Objects Liquidator (LoL).

Dominant Causes of Lingering Objects

  1. Long-term replication failures

While knowledge of creates and modifies is persisted in Active Directory forever, replication partners must inbound-replicate knowledge of deleted objects within a rolling Tombstone Lifetime (TSL) number of days (default 60 or 180 days, depending on the OS version that created your AD forest). For this reason, it’s important to keep your DCs online and replicating all partitions between all partners within a rolling TSL number of days. Tools like REPADMIN /SHOWREPL * /CSV, REPADMIN /REPLSUM and the AD Replication Status tool should be used to continually identify and resolve replication errors in your AD forest.
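As a sketch of that monitoring idea (the partner data here is illustrative; in practice you would parse real `repadmin /showrepl * /csv` output), flagging partners that have been silent longer than the TSL might look like this:

```python
# Hedged sketch: flag replication partners whose last successful inbound
# replication is older than the tombstone lifetime (TSL) -- the condition
# under which lingering objects can appear. Dates and names are made up.

from datetime import datetime, timedelta

TSL = timedelta(days=180)   # 60 days on forests created by older OS versions

def at_risk_partners(partners, now):
    """partners: list of (name, last_successful_replication) pairs."""
    return [name for name, last in partners if now - last > TSL]

now = datetime(2018, 10, 15)
partners = [
    ("DC02", datetime(2018, 10, 14)),
    ("DC03", datetime(2018, 3, 1)),    # silent for well over 180 days
]
print(at_risk_partners(partners, now))   # ['DC03']
```

In a real environment you would alert well before the TSL boundary, since cleanup is far cheaper at 30 days of failure than at 180.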

  2. Time jumps

System time jumps of more than TSL days into the past or future can cause deleted objects to be prematurely garbage collected before all DCs have inbound-replicated knowledge of all deletes. Protect against this by ensuring that:

  • The forest root PDC is continually configured with a reference time source (including following FSMO transfers).
  • All other DCs in the forest are configured to use the NT5DS hierarchy.
  • Time rollback and roll-forward protection has been enabled via the MaxNegPhaseCorrection and MaxPosPhaseCorrection registry settings or their policy-based equivalents.

The importance of configuring these safeguards can’t be stressed enough.

  3. USN rollbacks

USN rollbacks are caused when the contents of an Active Directory database move back in time via an unsupported restore. Root causes for USN Rollbacks include:

  • Manually copying a previous version of the database into place while the DC is offline.
  • P2V conversions in multi-domain forests.
  • Snapshot restores of physical and especially virtual DCs. For virtual environments, both the virtual host environment and the underlying guest DCs should support VM-Generation ID; Windows Server 2012 or later and vSphere 5.0 Update 2 or later support this feature.

Watch for the events, errors and symptoms that indicate you have lingering objects.

Figure 3: USN Rollbacks – How Snapshots Can Wreak Havoc on Active Directory
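The detection logic can be sketched in a few lines of Python. This is a simplified model, not the actual replication code: a partner remembers the highest USN it has seen per database invocation ID, and the same invocation ID presenting a *lower* USN signals that the database went back in time:

```python
# Simplified model of USN rollback detection by a replication partner.
# VM-Generation-ID-aware DCs avoid the rollback case by resetting the
# invocation ID after a snapshot revert, which partners treat as a new,
# safe database instance rather than a rollback.

def check_source(seen, invocation_id, highest_usn):
    """seen: dict mapping invocation_id -> highest USN observed so far."""
    prev = seen.get(invocation_id)
    if prev is not None and highest_usn < prev:
        return "USN rollback detected"   # replication from this source stops
    seen[invocation_id] = max(prev or 0, highest_usn)
    return "ok"

seen = {}
print(check_source(seen, "inv-A", 5000))   # ok
print(check_source(seen, "inv-A", 4000))   # USN rollback detected
print(check_source(seen, "inv-B", 4000))   # ok: new invocation ID after a safe restore
```

The third call shows why supported restores are safe: a fresh invocation ID means the lower USN is expected, not a sign of a database moving backwards.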

Cloning

You should always use a test environment before deploying the clones to your organization’s network.

DC cloning enables fast, safer domain controller provisioning through a clone operation.

When you create the first domain controller in your organization, you are also creating the first domain, the first forest, and the first site. It is the domain controller, through group policy, that manages the collection of resources, computers, and user accounts in your organization.

Active Directory Disaster Recovery Plan: It’s a Must

Build, test, and maintain an Active Directory Disaster Recovery Plan!

AD is indisputably one of an organization’s most critical pieces of software plumbing, and in the event of a catastrophe – the loss of a domain or forest – its recovery is a monumental task. You can use Site Recovery to create a disaster recovery plan for Active Directory.

The Microsoft Active Directory Disaster Recovery Plan is an extensive document: a set of high-level procedures and guidelines that must be heavily customized for your environment. It serves as a vital point of reference when determining root cause and deciding how to proceed with recovery alongside Microsoft Support.

Summary

There are several excellent reasons for virtualizing Windows Active Directory. The release of Windows Server 2012, with its virtualization-safe features and support for rapid domain controller deployment, alleviates many of the legitimate concerns that administrators have about virtualizing AD DS. VMware® vSphere® and our recommended best practices also help achieve 100 percent virtualization of AD DS.

Please reach out to your Dell EMC representative or check out Dell EMC Consulting Services to learn how we can help you virtualize AD DS, or leave me a comment below and I’ll be happy to respond.

Sources

Virtualizing a Windows Active Directory Domain Infrastructure

Related Blog

Best Practices for Virtualizing Active Directory Domain Controllers (AD DC), Part I

Best Practices for Virtualizing Active Directory Domain Controllers (AD DC), Part I
Mon, 17 Sep 2018

Virtualized Active Directory is ready for Primetime!

In today’s technology climate, monitoring for changes should be part of the organization’s security culture. Your IT team knows the importance of securing the network against data breaches from external threats; however, data breaches from inside the organization represent nearly 70% of all data leaks[1].

Are you doing enough to prevent the data leaks? Enter Active Directory Domain Services (AD DS).

“Virtualize-First” Is the New Normal

Reasons to virtualize Active Directory Domain Controllers.

As the prominent directory service and authentication store, AD DS sits at the heart of most network infrastructures and is a business critical application (BCA). It provides the methods for storing directory data and making this data available to network users and administrators. It stores information about user accounts – names, passwords, phone numbers, and so on – and enables other authorized users on the same network to access this information.

In much the same way that the criticality of AD DS differs between organizations, so does the acceptance of virtualizing this service. More conservative organizations choose to virtualize a portion of the AD DS environment and retain a portion on physical hardware. This proclivity stems from the complexity of timekeeping in virtual machines, deviation from current build processes or standards, the desire to keep an AD Flexible Single Master Operations (FSMO) role physical, concerns about privilege escalation, and fear of a stolen .vmdk.

Figure 1: Common Objections to Domain Controller Virtualization

But fear not!

The release of Windows Server 2012 (and later Windows Server 2016), with its virtualization-safe features and support for rapid domain controller deployment, alleviates many of the legitimate concerns that administrators have about virtualizing AD DS. VMware® vSphere® and our recommended best practices also help achieve 100 percent virtualization of AD DS.

Best Practices for Active Directory (AD) Availability

Active Directory is the cornerstone of every environment: when Active Directory comes to a halt, everything connected to it does too.

Since many domain controller virtual machines may be running on a single VMware ESXi host, eliminating single points of failure and providing a high-availability solution will ensure rapid recovery. VMware provides solutions for automatically restarting virtual machines: if a VMware ESXi host goes down, VMware High Availability (HA) can automatically restart a domain controller virtual machine on one of the remaining hosts, preventing loss of Active Directory. Using configuration options, you can prioritize the restart or isolation status of individual virtual machines. For example, it is important for domain controllers functioning as global catalog servers to be online before your Exchange Server environment initializes, so it is a best practice to set your domain controller virtual machines as high-priority servers.

Additionally, you can implement a script to restart a virtual machine via a loss-of-heartbeat alarm through vCenter (such scripts can be built with the VI Perl Toolkit or the VMware Infrastructure SDK 2.0.1). Combined with VMware Distributed Resource Scheduler (DRS) anti-affinity rules, this ensures that domain controllers from the same domain always reside on different VMware ESXi hosts, so you never place all the domain controllers in one basket. Anti-affinity rules let you specify which virtual machines must stay together and which must be separated.
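A quick way to sanity-check such a placement is to group DC virtual machines by domain and host and look for collisions. This is a hedged sketch: the VM and host names are hypothetical, and a real check would query the vSphere API or PowerCLI rather than a hard-coded list:

```python
# Illustrative anti-affinity check: verify that no two domain controllers
# from the same domain share an ESXi host. Placement data is hypothetical.

from collections import defaultdict

def antiaffinity_violations(placements):
    """placements: list of (vm, domain, host) tuples; returns groups of
    DCs from the same domain that share a host (i.e. rule violations)."""
    by_key = defaultdict(list)
    for vm, domain, host in placements:
        by_key[(domain, host)].append(vm)
    return [vms for vms in by_key.values() if len(vms) > 1]

placements = [
    ("DC01", "corp.local", "esxi-01"),
    ("DC02", "corp.local", "esxi-02"),
    ("DC03", "corp.local", "esxi-02"),  # violates the anti-affinity rule
]
print(antiaffinity_violations(placements))   # [['DC02', 'DC03']]
```

With DRS anti-affinity rules in place this list should always come back empty; a non-empty result after a host failure tells you HA restarted a DC somewhere DRS still needs to correct.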

For guidance, follow Microsoft Operations Master Role Placement Best Practices or Dell EMC’s recommended practices.

Achieving Active Directory (AD) Integrity in Virtual Environments

Performing consistent system state backups eliminates hardware incompatibility when performing a restore and ensures the integrity of the Active Directory database by committing transactions and updating database IDs.

For success in implementing Active Directory in the virtual environment, you must ensure a successful migration from the physical environment to the virtual environment. Since Active Directory is heavily dependent on a transaction-based datastore, you must guarantee integrity by making sure there is a solid, reliable means of providing accurate time services to the PDC Emulator and other domain controllers throughout the Active Directory forest.

Network performance is another key to success in a virtual Active Directory implementation, since slow or unreliable network connections can make authentication difficult. Modifying DNS weight and priority to reduce load on the PDC emulator can help improve network performance. Because Active Directory depends on reliable replication, ensure continuity by using a tool such as replmon to monitor it. Also, continue regular system state backups, and always restore from a system state backup. Virtual machines make it easy to move domain controllers; use VMware High Availability (HA) and VMware Distributed Resource Scheduler (DRS) rules so that critical domain controllers are not concentrated on a single host.
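To illustrate the DNS weight and priority mechanics (a simplified sketch of SRV record selection, not the full DC locator process; the record values are hypothetical), clients consider only the lowest-priority DC records and then choose among them in proportion to weight:

```python
# Rough sketch of DNS SRV selection for DC location: only lowest-priority
# records are candidates, then one is chosen in proportion to weight.
# De-weighting a busy DC (e.g. the PDC emulator) shifts load to the others.

import random

def eligible_dcs(records):
    """records: list of (dc, priority, weight); returns (dc, weight) candidates."""
    lowest = min(p for _, p, _ in records)
    return [(dc, w) for dc, p, w in records if p == lowest]

def pick_dc(records, rng=random):
    dcs, weights = zip(*eligible_dcs(records))
    return rng.choices(dcs, weights=weights, k=1)[0]

records = [
    ("PDC01", 0, 10),    # PDC emulator, de-weighted to reduce its load
    ("DC02",  0, 45),
    ("DC03",  0, 45),
    ("DC04",  10, 100),  # backup site: used only if all priority-0 DCs vanish
]
print(eligible_dcs(records))
```

With these illustrative values, clients pick the PDC only about 10% of the time, while DC04 handles no traffic at all until every priority-0 DC is unavailable.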

Practice the art of disaster recovery regularly. Finally, always go back and re-evaluate your strategies; monitor results for improvements and make adjustments when necessary.

Making Active Directory Confidential and Tamper-proof

Assessments in organizations that have experienced catastrophic or compromised events usually reveal they have limited visibility into the actual state of their IT infrastructures, which may differ significantly from their “as documented” states. These variances introduce vulnerabilities that expose the environment to compromise, often with little risk of discovery until the compromise has progressed to the point at which the attackers effectively “own” the environment.

Detailed assessments of these organizations’ AD DS configuration, public key infrastructures (PKIs), servers, workstations, applications, access control lists (ACLs), and other technologies reveal gaps in administrative practices, misconfigurations and vulnerabilities that, if remediated, could have prevented compromise and in extreme cases, prevented attackers from establishing a foothold in the AD DS environment.

See Microsoft’s Monitoring Active Directory for Signs of Compromise for further insights.

Figure 2: 4 tips for General Practices for Active Directory Confidentiality

Summary

There are several excellent reasons for virtualizing Windows Active Directory. Virtualization offers the advantages of hardware consolidation, total cost of ownership reduction, physical machine lifecycle management, mobility and affordable disaster recovery and business continuity solutions. It also provides a convenient environment for test and development, as well as isolation and security.

Stay tuned for part II of this blog series where I’ll address proper time and synchronization with virtualized AD DC, replication, latency and convergence; preventing and remediating lingering objects, cloning, and disaster recovery.

Please reach out to your Dell EMC representative or check out Dell EMC Consulting Services to learn how we can help you virtualize AD DS, or leave me a comment below and I’ll be happy to respond.

Sources

Virtualizing a Windows Active Directory Domain Infrastructure

Microsoft’s Avenues to Compromise

[1] Statista.com Data Breaches Recorded in the U.S. by Number of Breaches and Records Exposed

Related Blog

Best Practices for Virtualizing Active Directory Domain Controllers (AD DC), Part II

Multicloud is the New Reality
Mon, 23 Apr 2018

In my many years in the IT industry I’ve seen plenty of new buzzwords come out and be adopted almost immediately. Words like virtualization and cloud were suddenly everywhere, and vendors would rush to say their products were “cloud ready” or “optimized for cloud” to capture that excitement. Suddenly everyone’s virtualized environment became a cloud, even if it was called a virtual infrastructure just the week prior. We’re seeing more of that today with new buzzwords like blockchain, IoT, and others. In my world the new hotness is “multicloud.”

It’s true – hybrid cloud has become old and busted and the new hotness is multicloud (well done, if you got my 90s movie reference). In many cases folks conflate the terms hybrid cloud and multicloud, thinking they actually mean the same thing. The truth is that both hybrid cloud and multicloud are separate terms and both are equally important to an organization’s IT strategy.

Multicloud may be a new buzzword, but where there’s smoke there’s fire. The RightScale 2018 State of the Cloud Report found that 81% of organizations have a multicloud strategy. That shows organizations are taking multicloud seriously and recognizing that a cloud strategy that looks holistically across clouds is the future. Perhaps more importantly, the report found that organizations are already using almost five clouds on average today.

If an organization uses 5 clouds, does that mean they’ve adopted a multicloud strategy? Is it as simple as using multiple clouds for your infrastructure and applications?

As with most things in IT, and in life, it isn’t quite that simple. If you simply use multiple clouds for different purposes without tying them together then you’ve likely just created new silos that increase management costs and introduce risk.

In order to properly tie multiple clouds together you need to consider a few elements.

  • Embrace a cloud-first operating model
  • Control your destiny
  • Adopt an actionable strategy

When organizations embrace a cloud-first operating model, they can move more quickly to implement new ideas, lower overall complexity and risk, and create systems that are transparent and efficient. The people and process portion of multicloud is absolutely critical to success, because you can’t solve this with technology alone. Having an operating model that allows for DevOps, provides cost visibility of workloads, and uses a service management framework is absolutely necessary for success in multicloud.

Next, organizations need to control their own destiny by choosing the cloud infrastructure that supports their goals and business objectives. The cloud infrastructure an organization chooses needs to be deployable quickly, be integrated across compute/storage/networking, and should support cloud access. The right cloud infrastructure can be used to create single interfaces that allow managing and provisioning of resources in a hybrid cloud model (see – hybrid cloud isn’t so old and busted after all). Tools can be used to centralize cloud access, perform cost analysis, and simplify cloud consumption.

Finally, organizations need to adopt an actionable strategy that is aligned to their desired business outcomes. This strategy needs to consider:

  • The infrastructure that will be used to support their cloud initiatives
  • The applications that will either be moved to the new cloud platforms or ultimately retired or refactored into cloud-native applications
  • The operating model, tightly integrated with the business, that brings this all together

Delivering Modern Applications with Azure Stack
Sun, 25 Mar 2018

Accelerate Your Digital Transformation with Dell EMC Cloud for Microsoft Azure Stack

Many vendors in the cloud world are trying to approach the challenge of providing an easy way to deploy applications on-premises in a private cloud as well as off-premises into a public cloud. Microsoft’s approach is to provide a common interface, development framework, and automation engine to deploy applications either in the public cloud with Azure or in the private, on-premises cloud with their Azure Stack solution. The industry response, unsurprisingly, has been largely positive.

Being able to develop an application once and deploy it anywhere without needing to modify or re-work it is a key value proposition of Azure Stack. This capability makes Azure Stack popular not only for customers who already are heavy consumers of Azure public but also service providers who are looking to offer Azure services to their customers.

Dell EMC Cloud for Microsoft Azure Stack solution enables customers to run Azure Stack software on Dell EMC hardware

Azure Stack can provide an ‘easy button’ for those that already use Microsoft Azure and the many cloud services that it provides. Organizations don’t need to train their developers on new tools or IT administrators/operators on a new system they have to manage and maintain. Azure Stack provides agility to organizations that are looking to deploy new applications quickly and consistently across their organization and to their customers.

Here at Dell EMC, we offer our Dell EMC Cloud for Microsoft Azure Stack solution to enable our customers to run the powerful Azure Stack software on proven Dell EMC hardware. Our solution also goes beyond simply providing a hardware platform on which to run the Azure Stack software. We enable our customers to provide a complete cloud solution to their customers or end users.

To do this, we:

  • Deliver a hardware platform that is fully tested and integrated with the Azure Stack software to provide a fully functional, supported, and trusted cloud platform.
  • Integrate our own industry-leading solutions for things like data protection and security directly into the solution to provide additional functionality and meet our customers’ business objectives.
  • Provide our experienced Consulting team who can customize the solution for our customers.

Dell EMC and Microsoft are holding a series of events to help our customers better understand the capabilities of Azure Stack and start thinking about how they may adopt it in their own environments. These events will cover an overview of the solution, potential use cases for the Azure Stack platform, and then a live demo of the Azure Stack solution. Our Consulting team (myself included) will be at these events talking about how we can help customers deploy their modern applications on a fully automated and integrated Azure Stack cloud.

I truly hope you can make it out to one of these upcoming events to see the power of Azure Stack and what it can bring to your organization.


Your Exclusive In-Person Invitation to Learn More about Dell EMC Cloud for Microsoft Azure in a City Near You

Extend your investment in Azure to deliver consistent end-user experiences wherever the data and applications reside. Dell EMC Cloud for Microsoft Azure Stack allows you to experience true application and workload portability — on both the public Azure cloud and within the data center.

Join solutions experts and technology leaders from Microsoft and Dell EMC to learn how, with a fully-integrated Azure Stack platform, you can:

  • Improve delivery time for new applications and services with a turnkey infrastructure platform.
  • Become an IT-as-a-service broker, elevating IT’s importance to the business.
  • Meet the demands of regulatory compliance and customer data privacy.

You will also participate in an Azure Stack demonstration and an interactive discussion about use cases. We look forward to meeting with you!

Click the links below to register now!

Thursday, April 5th (Santa Clara, CA)

Tuesday, April 24th (Denver, CO)

Click here for the Solution Overview of Dell EMC Cloud for Microsoft Azure Stack.

 

Virtualize Active Directory, the Right Way!
Thu, 17 Aug 2017

Virtualizing Microsoft Active Directory domain controllers, and business critical applications in general, is near and dear to my heart. I firmly believe that there are almost no applications left that can’t be virtualized, and this session gives me an opportunity to share my experiences and help others become successful. Business critical applications have become, for the most part, the last applications and servers that are still physical in many organizations. Getting as close to 100% virtualization as possible is an important goal to strive for.

Why is that important? Another firmly held belief of mine is that virtualization is truly the on-ramp to the cloud. By virtualizing even your organization’s most important workloads, you take one step closer to a future state where you can start taking advantage of cloud computing in your organization.

Of course, simply having a virtual infrastructure doesn’t mean you have a cloud. Having a true hybrid cloud involves additional components to facilitate automation, orchestration and to provide users with that service catalog where they can consume IT resources on a self-service basis. Virtualizing your organization’s servers makes it easier to start layering in those cloud components, and once in place you’ll want even your business critical servers virtualized, so you can start taking advantage of what a true hybrid cloud has to offer.

It’s that time again – the annual VMworld conferences.  This is my 13th VMworld!

This year I’m presenting a session called “Virtualizing Active Directory: The Right Way!” on Tuesday, Aug 29, 4:00 p.m. – 5:00 p.m. It was a top 10 session last year, so if you’re at the conference, come by early to get a good seat. Bring your copy of Virtualizing Microsoft Business Critical Applications on VMware vSphere or VMware vSphere: Performance and I’ll be happy to sign it. Let me (@mattliebowitz) know what you think of the session, the book or the conference.
If you’re walking down the halls at VMworld and happen to see someone who looks like former VMware CEO Paul Maritz, stop him and say hi.  It’s probably me!

Does Enterprise Hybrid Cloud Fulfill the Promise of “True” Hybrid Cloud?
Mon, 10 Oct 2016

The post Does Enterprise Hybrid Cloud Fulfill the Promise of “True” Hybrid Cloud? appeared first on InFocus Blog | Dell EMC Services.

Late last year I read a great article from Wikibon called “True” Private Cloud will begin shipping to the market in 2016. I really liked how their definition of private cloud matched up with the capabilities and structure of our own Enterprise Hybrid Cloud. As I sit here on this long flight from New Jersey to Las Vegas for VMworld 2016, I decided to revisit that article and see how well it’s stood up in 2016 and if our Enterprise Hybrid Cloud really meets their definition of True Private Cloud. And, more importantly, talk about why it’s important to have hybrid as part of your cloud strategy.

Comparing True Private Cloud to Enterprise Hybrid Cloud

To start off, let’s look at how Wikibon defines True Private Cloud and how it compares to Enterprise Hybrid Cloud.

Converged infrastructure

“Built with a foundation of converged (or hyperconverged) infrastructure, that can be highly automated and managed as logical pools of compute, network and storage resources.”

Since the release of Enterprise Hybrid Cloud in 2014, our company has supported converged solutions like Vblock and VxBlock as the platform of choice. We’ve also supported a “bring your own” model where a customer can choose their own hardware and, provided it meets the requirements, our services team helps the customer convert it to Enterprise Hybrid Cloud.

Despite that, the vast majority of our customers have gone down the route of converged infrastructure. Why? Customers get it. They know that converged infrastructure is the fastest path to success, simplifying the architecture while providing a powerful and supported combination of industry leading technologies.

Self-service

“Enables end users (developers, line-of-business, etc.) to have self-service access to resource pools and have visibility to internal costs or IT chargeback pricing.”

It’s true: you can provide powerful hardware to run your cloud. But if IT consumers can’t easily get access to your cloud solution, they’re going to find love in the arms of another cloud. A true private/hybrid cloud needs to provide the same self-service provisioning and cost visibility that public clouds provide.

The Enterprise Hybrid Cloud leverages the power of VMware’s vRealize Suite to provide powerful self-service capabilities and cost visibility back to the business. That suite of software gives customers a powerful self-service catalog and orchestration engine, a tool to monitor performance in the environment, and cost visibility for the resources consumed. Combined with the extensive engineering that went into creating Enterprise Hybrid Cloud, it provides customers with a very functional cloud solution.

One-stop shopping for support

“A single point of purchase, support, maintenance, and upgrade for a pre-tested and fully maintained complete solution (a single throat to choke).”

As a technologist it’s often easy to get caught up in the “speeds and feeds” of a cloud solution. While that may be technically interesting, the thing that CIOs care about is driving business value from IT. Creating a cloud from scratch is a daunting task for customers and the fact that Enterprise Hybrid Cloud has been created with thousands of hours of engineering effort makes it a very compelling solution. Customers know when they unwrap their Enterprise Hybrid Cloud it’s not an “assembly required” platform. They know it’ll be delivered quickly and be ready to go “out of the box,” quickly driving business value right away instead of months in the future.  Again, customers get it.

There are other pieces of Wikibon’s definition of True Private Cloud that I encourage you to read, but you might be wondering why I’m talking about Enterprise Hybrid Cloud in the context of private cloud. Maybe I’m hopeful next year Wikibon will change their definition to True Hybrid Cloud?

The key to a successful hybrid cloud implementation

The fact is customers need to adopt a solution that has both private cloud capabilities and public cloud capabilities. The key to making the hybrid model successful is to use a platform that provides hybrid functionality along with private. If IT tells its developers to go to one tool for on-premises and another tool for off-premises it’s likely to end badly.

Most developers or end users don’t care where their workload is provisioned. They care about things like performance characteristics, capabilities, and cost (to name a few). Making all of this visible for both public and private clouds all from the same interface allows the consumer of cloud resources to make the decision based on the needs of the business and not the limitations of the technology. We know customers want this, and we listen to our customers.

Enterprise Hybrid Cloud supports “out of the box” integration with public cloud providers like VMware vCloud Air and Amazon Web Services. In the future we’ll see even more public clouds supported, providing customers with the choices they need to make the decisions that are right for their business.

In closing, I think the Wikibon article does a great job of defining not only private cloud but also hybrid cloud. And I’m also happy to see that Enterprise Hybrid Cloud “checks the boxes” of private cloud while also providing hybrid capabilities that our customers are asking for.

Tips for Unlocking Business Value with Cloud https://infocus.dellemc.com/matt-_liebowitz/tips-unlocking-business-value-cloud/ https://infocus.dellemc.com/matt-_liebowitz/tips-unlocking-business-value-cloud/#respond Mon, 03 Oct 2016 11:00:43 +0000 https://infocus.dellemc.com/?p=28889 As I’ve talked about both with customers and in previous blog posts, cloud needs to drive business value. CIOs are not interested in deploying cloud because they read a blog post about it or because Gartner says they should. Ultimately they understand that the world of technology is changing and people are increasingly expecting a […]

The post Tips for Unlocking Business Value with Cloud appeared first on InFocus Blog | Dell EMC Services.

As I’ve talked about both with customers and in previous blog posts, cloud needs to drive business value. CIOs are not interested in deploying cloud because they read a blog post about it or because Gartner says they should.

Ultimately they understand that the world of technology is changing and people are increasingly expecting a self-service model in everything they do. This is true whether they’re downloading an application on their smartphone, calling for a car, or provisioning IT resources. CIOs need to adopt this model (IT as a Service) to help bring value to the business and drive the necessary outcomes of the business.

What does it actually mean to drive business value? It sounds really good to say it and people think you’re smart, but obviously there’s more to it than that. Let’s look at some examples of how cloud drives business value for customers.

It’s all about the applications

For those of us in technology we sometimes spend a little too much time thinking about the hardware in our solutions. I’ll admit, I’m guilty of it, too. When a new smartphone is being released I’m always interested in how much RAM it has and how many CPU cores it has—as if I’m going to run virtual machines on it (I totally would, if I could, but that’s beside the point). When you think about it, what good is the extra RAM or CPU power in a smartphone (or cloud) if it doesn’t run the software you need. It all comes down to the applications.

One way to extract business value out of cloud is a simple “lift and shift” of your application workloads into the cloud. The inherent capabilities of a cloud like Enterprise Hybrid Cloud, including self-service management, cost visibility, and backup as a service, bring functionality that likely wasn’t there previously. That does bring some value, but dropping an application into a cloud doesn’t typically provide enhanced automation and orchestration at the application level. You really start driving value when you make application owners more nimble and provide capabilities beyond what was available before the applications moved to the cloud.

Enterprise application blueprints drive value

EMC has invested thousands of hours of engineering the Enterprise Hybrid Cloud platform, creating integration with Dell EMC products and providing lots of great functionality. One area where significant engineering effort was spent was in creating a set of application blueprints for common enterprise applications. These include Microsoft Exchange Server, SQL Server, and the Oracle database platform just to name a few.

We hear from customers all the time that their teams want database as a service (DBaaS), allowing developers to more quickly provision and manage databases for the applications they’re writing. The Engineered Blueprints for Microsoft SQL Server open the door to DBaaS by allowing our Dell EMC Services team to drop in a set of fully engineered blueprints for SQL Server that provide DBaaS capabilities. For example, these blueprints allow provisioning and de-provisioning of individual databases, database instances, or even entire database servers. End users can also back up and restore databases on demand, freeing them to work more quickly without having to wait for IT or DBAs to perform many of these functions for them.
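As a rough sketch of what that self-service model looks like, here’s a minimal illustration in Python. The `DbaasCatalog` class and its catalog item names are hypothetical, invented purely for this example; the actual Engineered Blueprints are implemented as workflows inside the cloud platform, not as standalone code.

```python
# Hypothetical sketch of a DBaaS-style self-service catalog.
# Class and item names are invented for illustration; the real
# Engineered Blueprints run as workflows in the cloud platform.

class DbaasCatalog:
    """Tracks self-service database requests at three granularities."""

    SERVICES = {"database", "instance", "server"}

    def __init__(self):
        self.provisioned = []

    def provision(self, kind, name):
        # Reject requests for catalog items that don't exist.
        if kind not in self.SERVICES:
            raise ValueError(f"unknown catalog item: {kind}")
        item = {"kind": kind, "name": name}
        self.provisioned.append(item)
        return item

    def deprovision(self, name):
        # Self-service teardown: no ticket to IT required.
        self.provisioned = [i for i in self.provisioned if i["name"] != name]

catalog = DbaasCatalog()
catalog.provision("database", "orders_db")
catalog.provision("instance", "dev_sql01")
catalog.deprovision("orders_db")
print([i["name"] for i in catalog.provisioned])  # ['dev_sql01']
```

The point of the sketch is simply that provisioning, de-provisioning, and cleanup become catalog operations the end user drives, rather than requests that queue up with a DBA.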

It doesn’t just stop at SQL Server. These Engineered Blueprints can provide functionality for Exchange Server, like email as a service, automated provisioning of highly available email infrastructures, and backup/recovery on demand. Similar functionality is available for SharePoint Server, Oracle, and SAP applications, too.

By giving application users and developers access to capabilities they didn’t have before, they become more agile and efficient. When businesses can create the applications or enterprise systems they need to compete and to provide products and services to their customers, that’s when real value is unlocked.

Give the people what they want

The right tool for the job is important—whether you’re building a house, repairing a car, or working with an organization’s application portfolio. Most organizations today have a mix of “off-the-shelf” applications, “home-grown” applications, and more modern applications built in cloud-native frameworks. One tool or technology is not necessarily right for all of those workloads as their needs and requirements are different. We believe in providing choice to our customers, and that’s no different here.

Enterprise Hybrid Cloud is a fantastic platform for the off-the-shelf enterprise applications from vendors like Microsoft, Oracle, and others. It’s built from the ground up for this class of application. As I described it above, Enterprise Hybrid Cloud has the capabilities to provide real business value. Dell EMC also has another cloud solution called Native Hybrid Cloud for those customers that are writing the next generation of cloud-native applications. Native Hybrid Cloud provides a turnkey cloud platform leveraging a scale-out architecture and platform integration with Pivotal Cloud Foundry to give developers a platform that accelerates their creation of the next generation of applications.

If customers try to cram a square peg into a round hole and use the wrong platform for the job, it becomes much more difficult to unlock the value that the business needs. Both Native Hybrid Cloud and Enterprise Hybrid Cloud are designed to be delivered quickly from our Dell EMC Services team in order to enable our customers to quickly see the value of their investments.

Listen and learn

One of the most important parts of any of our cloud projects is talking to our customers to understand their goals, business objectives, and the outcomes they’re trying to achieve. It sounds obvious but it’s true – our goal is not to simply drop off a cloud solution in a “one size fits all” manner. We sit down with our customers to understand where they’re going and then, using our cloud solutions as a foundation, work together to craft an architecture that meets their goals. It’s very consultative and outcome-focused.

We want to help our customers achieve their goals and build a lasting relationship. We wouldn’t be successful if we approached cloud as a single solution for everyone, as not all organizations measure the business value derived from their IT investments in the same way. We listen, learn and adapt based on the requirements of all of our customers.

We’re in this together with our customers in marching towards a “cloudy” future. Our goal is to provide solutions that help our customers solve their business problems. It’s an exciting time to be part of our Dell EMC Services team!

From Factory Automation to Cloud Automation https://infocus.dellemc.com/matt-_liebowitz/from-factory-automation-to-cloud-automation/ https://infocus.dellemc.com/matt-_liebowitz/from-factory-automation-to-cloud-automation/#respond Mon, 22 Aug 2016 12:47:16 +0000 https://infocus.dellemc.com/?p=28636 My six-year-old son loves the show How It’s Made on the Science Channel. There are usually many episodes back-to-back on Sunday mornings, and he’ll be up at 7 a.m. or earlier ready to watch. I’ll usually sit down and watch the episodes with him and am fascinated by the automation that goes into creating some […]

The post From Factory Automation to Cloud Automation appeared first on InFocus Blog | Dell EMC Services.

My six-year-old son loves the show How It’s Made on the Science Channel. There are usually many episodes back-to-back on Sunday mornings, and he’ll be up at 7 a.m. or earlier ready to watch. I’ll usually sit down and watch the episodes with him and am fascinated by the automation that goes into creating some of the things we use in our everyday lives.

As I thought more about it, I thought about how it’s not that different from when we work with our customers to adopt cloud computing. These customers need to look at their operational procedures, processes, and how they run their business and begin to identify areas where automation can bring about real savings. After all, cloud is not very useful unless you can automate your IT processes and then offer it out as a service to users and customers. Let’s take a look at some of the lessons we can learn from these big factories that automate their assembly lines to create the products we use.

How did they figure that out?

One thing that I always end up saying to myself while watching the show is something along the lines of, “How did they figure out all of these complex procedures?” In other words, how do they know that the metal they’re working with needs to go into an oven that cooks at a specific temperature for a set period of time in order to harden it properly? How do they know exactly how much ice cream to portion out for each ice cream sandwich?

The answer, in all cases, is that the company who designed the product fully understands what is involved in the process of creating the product. It sounds pretty simple and obvious, but unfortunately many IT departments don’t follow this same logic when they approach automation. IT departments understand they need to automate the process, but in their rush to do so they don’t fully understand all of the implications.

A relatively basic example of this would be the deployment of application workloads. Before cloud and automation, the requestor would typically request the server from IT. Now that most workloads are virtualized, it’s relatively easy for IT to create a new virtual machine, often in a matter of hours or days. In some cases they’ll log the entry into a CMDB or update a ticket in an ITSM system and then hand the server off to the requestor.

In that scenario, IT may not have any sense of what happens after the server is deployed. Does the software being installed have any special licensing considerations? What is the expected lifecycle of the server? Does it need to integrate into any other systems? Simply put, if they don’t have the answers to these (and likely other) questions, how can IT be expected to properly automate the deployment of that application? IT needs to work with application owners, developers, and other stakeholders, in order to fully understand what is required before trying to automate the application workloads. By working together with the people who will be using the application they can properly automate it and bring real value to the business.
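One way to make those questions concrete is to treat them as required metadata that a provisioning request must carry before any automation runs. This is just a sketch of the idea; the field names and the `validate_request` function are invented for illustration, not part of any Dell EMC tooling.

```python
# Hypothetical pre-automation check: refuse to automate a workload
# deployment until the requestor has answered the questions IT needs
# (licensing, lifecycle, integrations). Field names are invented.

REQUIRED_FIELDS = ("owner", "license_model", "expected_lifecycle", "integrations")

def validate_request(request: dict) -> list:
    """Return the list of unanswered questions; empty means ready to automate."""
    return [f for f in REQUIRED_FIELDS if not request.get(f)]

# A request that only answers two of the four questions:
incomplete = {"owner": "app-team", "license_model": "per-core"}
missing = validate_request(incomplete)
print(missing)  # ['expected_lifecycle', 'integrations']
```

Gating automation on a checklist like this is one simple way to force the conversation between IT and the application owners to happen before, rather than after, the workload is deployed.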

They use the right tool for the job

In the factory some of the tools used to create products are custom-made, and others can be repurposed. The same factory that makes packaged turkey can likely also create other packaged foods due to the similarities in requirements for automating those processes. Likewise, in the world of cloud, there are many different tools available to automate functions and choosing the right one is crucial to making the most of your investment.

When we work with customers deploying Enterprise Hybrid Cloud, we spend a lot of time up-front understanding the customer’s current state. What tools do they have in place? What skills does the customer’s IT team have, and what technologies can they support? Gathering this information helps us recommend the best solution for each customer. After all, why write a script for something when an existing tool might already be available?

Our customers are often already using tools like Puppet or Chef that can provide key functionality for cloud automation and orchestration. For integrating with third-party systems, there may be existing plug-ins for tools like vRealize Orchestrator that provide this functionality. And, of course, there are other systems that require custom-written scripts to properly automate functions.

When picking the right tool for the job, organizations need to consider many factors. We help them figure that out up-front so they can see real, tangible benefits and savings with Enterprise Hybrid Cloud.

Automate everything?

Occasionally How It’s Made will show certain items being created by hand. In many cases this is due to the precision required for what they’re making. There may be another important reason that the show leaves out: scale.

Just because something can be automated doesn’t necessarily mean it can scale. And, more importantly, just because something can be automated doesn’t mean that it should be automated. It may cost more money, time, and effort to create the automation than the benefit customers and end users will realize. The big factories know and understand this, and IT needs to as well. Fully understanding everything that goes into automating your processes, or making ice cream sandwiches, will allow organizations to get the most benefit.

Enterprise Hybrid Cloud brings real value to customers by helping them package up and deliver IT as a service, and automation is a key element of that. By fully understanding what needs to be done to properly automate something—knowing what tools you have available at your disposal and making decisions around the value of automating processes—organizations can derive real value from their cloud investments.

How TiVo Taught My Wife About Hybrid Cloud https://infocus.dellemc.com/matt-_liebowitz/how-tivo-taught-my-wife-about-hybrid-cloud/ https://infocus.dellemc.com/matt-_liebowitz/how-tivo-taught-my-wife-about-hybrid-cloud/#respond Mon, 30 Mar 2015 15:45:12 +0000 https://infocus.dellemc.com/?p=23037 There’s a famous quote that says, “If you can’t explain it simply, you don’t understand it well enough.” That statement is true. But I’m sure many of us have examples of where we’re challenged to explain something we know very well in a simple manner. Take the example of working in technology and explaining what […]

The post How TiVo Taught My Wife About Hybrid Cloud appeared first on InFocus Blog | Dell EMC Services.

There’s a famous quote that says, “If you can’t explain it simply, you don’t understand it well enough.” That statement is true. But I’m sure many of us have examples of where we’re challenged to explain something we know very well in a simple manner.

Take the example of working in technology and explaining what you do to family or friends who are in completely different fields. Can you easily explain virtualization, for example, to your car mechanic? What about explaining the concepts of hybrid cloud to an airline pilot (who has an altogether different view of clouds)?

Usually, the easiest way to explain a complex subject to someone who has never been exposed to it is to put it into terms of examples that they already understand. And so it was that I found myself talking with my wife about the benefits of hybrid cloud in terms of our TiVo DVR.

For years we’ve used TiVo to record shows for ourselves and our kids. TiVo has had a feature called Season Pass that automatically records all episodes of a TV show when they air on a particular channel. Recently, TiVo upgraded the Season Pass into a new feature called OnePass that brings together content on your TiVo with content available via streaming services like Netflix and Amazon.

We were recently at a party where someone recommended we watch a show we had never heard of. When we got home we decided to record it, and we got to see how the OnePass feature worked with this new show. The feature would allow us to record all new episodes of the show directly on our TiVo while also allowing us to use streaming services to catch up on the previous season that we haven’t seen. It does this all from the same familiar interface, grouping the newly recorded episodes and those episodes available from streaming services together into the same view.

You can see an example of what this looks like using my son’s favorite show Chuggington. From the same view, I can choose to stream episodes from multiple sources (in this case Amazon and Netflix, as shown in the lower right), indicated by the three little blue lines next to the name of the show. Or I can watch episodes I already recorded on my TiVo, as indicated by the green dot next to each episode. It works great—I get all of the content that I want in one interface that I’m already familiar with.

Lest you think this is just a blog post advertising TiVo, let me bring it back to hybrid cloud. Those of you already familiar with the Federation Enterprise Hybrid Cloud probably already see the connection here. What TiVo has done with the OnePass feature is provide choice, allowing us to record shows directly on our TiVo (on-premises infrastructure) while also letting us consume content from streaming services (off-premises cloud) all from the same interface. We could make the decision to use paid services like Netflix to catch up on old episodes, or simply record them directly on the TiVo (at no extra cost above our normal cable bill) when they air again.

The Federation Enterprise Hybrid Cloud provides that same type of functionality. You can choose to deploy workloads or services on-prem or use public clouds like vCloud Air all from the same interface. Consumers of cloud services usually don’t care where the workload runs, they just want the ability to consume it based on criteria like cost, data protection, recovery levels, or performance just to name a few. We talk a lot about how the Federation Enterprise Hybrid Cloud provides choice, and this is an easy-to-understand example of how we provide that choice.

Of course, Federation Enterprise Hybrid Cloud doesn’t just stop at the creation of workloads. Cloud consumers can choose to back up or restore provisioned workloads on demand. They can also provision entire application stacks in a fraction of the time that it would have taken without a cloud infrastructure. And they can do this while having the visibility into the costs for all of the decisions that they make, helping them decide upon the most appropriate place for their workloads to run.

And so I bring it back to where I started. Hybrid cloud can be a complex topic. But I was able to explain it to my wife in terms she is familiar with, and she instantly understood the benefits. On that night everybody won—I got to explain the benefits of hybrid cloud in a new way that gave me the idea for this blog post, and we ended up with a new show to watch. Now if she’d ever give me the remote control, I might actually be able to watch it.

EMC Enterprise Hybrid Cloud Helps You Replatform https://infocus.dellemc.com/matt-_liebowitz/emc-enterprise-hybrid-cloud-helps-replatform/ https://infocus.dellemc.com/matt-_liebowitz/emc-enterprise-hybrid-cloud-helps-replatform/#respond Thu, 11 Dec 2014 15:00:36 +0000 https://infocus.dellemc.com/?p=22059 When you think of the hybrid cloud computing and how you might envision using it in your organization, you might first conjure thoughts of the next generation of applications. Or maybe you want to provide your developers with easy access to provision new servers or applications on the fly without needing to wait for IT. The […]

The post EMC Enterprise Hybrid Cloud Helps You Replatform appeared first on InFocus Blog | Dell EMC Services.

When you think of hybrid cloud computing and how you might envision using it in your organization, you might first conjure thoughts of the next generation of applications. Or maybe you want to provide your developers with easy access to provision new servers or applications on the fly without needing to wait for IT. The EMC Enterprise Hybrid Cloud is built from the ground up to provide this kind of functionality (and more), but there are a lot of other potential use cases (some of which I discussed in an earlier blog).

In fact, when I think about the EMC Enterprise Hybrid Cloud, it makes me want to party like it’s 2003.

End of Support for Windows Server 2003

Windows Server 2003 will reach what Microsoft refers to as End of Support on July 14, 2015. After that date, customers will need to pay for a custom support agreement from Microsoft in order to continue to receive any support on that version of Windows. You might be sitting there thinking, “Who cares? How many servers running Windows Server 2003 could still be out there?” The answer: a lot.

In fact, we’ve spoken with customers that still have thousands of servers running Windows Server 2003. It’s a major effort to migrate from Windows Server 2003 to a more modern version like Windows Server 2012, bringing customers dangerously close to the End of Support date for Windows Server 2003.

Let’s talk about how the EMC Enterprise Hybrid Cloud can help.

To The Cloud!

Imagine you’re that organization that still has thousands of servers running Windows Server 2003. This isn’t a case of needing to migrate off the underlying hardware, so you can’t simply convert these servers to virtual machines and call it a day. To solve this problem, you actually need to deploy new servers running a modern version of Windows and replatform your applications. Without a modern cloud infrastructure like the EMC Enterprise Hybrid Cloud, that likely means a lot of manual provisioning of new virtual machines using lots of spreadsheets. IT becomes the bottleneck in terms of provisioning the servers and making sure the individual requirements of each server are met.

With the EMC Enterprise Hybrid Cloud, IT can allow application owners to provision their own servers at their own pace all through a service catalog. Instead of managing spreadsheets that are prone to human error, you can let the folks who know the systems better do the actual provisioning of servers. IT is no longer the bottleneck. Multiple application owners can provision servers to their specific requirements on their own which can significantly increase the speed of server deployment.

Measure and Monitor

Provisioning new servers is just one step in the process. You’ll also want to be able to monitor these servers better than you could before. Part of the EMC Enterprise Hybrid Cloud solution is VMware’s vCenter Operations (now referred to as vRealize Operations), which provides enhanced visibility into the performance characteristics of your servers. All workloads provisioned onto the EMC Enterprise Hybrid Cloud are automatically monitored by vCOps, providing real-time visibility into their performance characteristics. It also provides another valuable piece of data: capacity planning. vCOps can look at the workloads that have been provisioned and determine whether or not they are oversized. This opens up the opportunity to right-size servers that the application owner may have oversized.

Another area where the EMC Enterprise Hybrid Cloud helps is with providing the true cost of deploying and maintaining these servers. With the EMC Enterprise Hybrid Cloud, you can assign costs to each component in a server and provide the true cost of deployment and maintenance. By providing the cost of each server, application owners can easily compare the cost of that server running on the EMC Enterprise Hybrid Cloud with the cost of running the same server on a public cloud like vCloud Air—allowing them to make intelligent decisions about where their workload should run. You could then use the cost models to charge the business units for the servers they provision.

Many organizations are not quite ready for a chargeback model but still want to be able to report on the costs. The EMC Enterprise Hybrid Cloud can provide “showback” (or my favorite, “shameback”) costs to the application owners so they can see what this would cost in case your organization ever adopts a chargeback model.
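To make the showback idea concrete, here’s a minimal sketch of per-component cost assignment. The component rates below are made-up numbers for illustration only, not actual Enterprise Hybrid Cloud or public cloud pricing.

```python
# Hypothetical showback calculation: monthly cost is the sum of
# per-component rates for each provisioned server. All rates are
# made-up numbers for illustration, not real pricing.

RATES = {"vcpu": 15.0, "gb_ram": 5.0, "gb_disk": 0.10, "backup": 25.0}

def monthly_cost(vcpus, gb_ram, gb_disk, backup=True):
    """Return the monthly showback cost for one server, in dollars."""
    cost = (vcpus * RATES["vcpu"]
            + gb_ram * RATES["gb_ram"]
            + gb_disk * RATES["gb_disk"])
    if backup:
        cost += RATES["backup"]  # flat fee for backup as a service
    return round(cost, 2)

# A 4 vCPU / 16 GB RAM / 100 GB disk server with backup:
print(monthly_cost(4, 16, 100))  # 175.0
```

With a model like this, an application owner can put the showback figure for an on-premises server side by side with a public cloud quote for the same configuration and decide where the workload belongs.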

Don’t Forget Backup

You’ve gone to the trouble of provisioning hundreds or even thousands of new servers to support your Windows Server 2003 replatforming effort. You’re monitoring these new servers better than you have before, and you can even calculate the cost of each. Clearly these workloads are important to you so you probably want to protect them, right? There’s an app for that!

In my last post I talked about what makes EMC Enterprise Hybrid Cloud different from other hybrid cloud solutions. One of those things is our custom workflows that integrate Avamar directly into the EMC Enterprise Hybrid Cloud service catalog. Without the EMC Enterprise Hybrid Cloud, an administrator might need to add each provisioned server to a specific Active Directory group or Organizational Unit, or they might need to choose specific backup policies within Avamar for every server. That sounds awful.

The EMC Enterprise Hybrid Cloud has the ability to let administrators and application owners choose their backup policy when they’re provisioning the workload, automating a series of manual tasks and saving a significant amount of time. Even better, it gives control over backup and restore to the owners of the servers. They can simply navigate to the EMC Enterprise Hybrid Cloud service catalog and choose to run a backup or restore a previous backup rather than needing to involve IT and potentially having to wait.
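Here’s a rough sketch of what choosing a backup policy at provisioning time might look like. The policy names and the `provision_with_backup` helper are hypothetical, standing in for the actual Avamar integration workflows in the service catalog.

```python
# Hypothetical sketch: the backup policy is chosen as part of the
# provisioning request, so nobody has to assign AD groups or backup
# policies by hand afterward. Policy names are invented for illustration.

BACKUP_POLICIES = {"bronze": "weekly", "silver": "daily", "gold": "hourly"}

def provision_with_backup(server_name, policy):
    """Provision a server with its backup policy baked into the request."""
    if policy not in BACKUP_POLICIES:
        raise ValueError(f"unknown backup policy: {policy}")
    # In a real workflow this step would call the provisioning and
    # backup APIs; here we just return the record the workflow creates.
    return {"server": server_name, "policy": policy,
            "schedule": BACKUP_POLICIES[policy]}

record = provision_with_backup("app01", "silver")
print(record["schedule"])  # daily
```

The design point is that backup stops being a separate, manual follow-up task: it’s an attribute of the workload from the moment the workload exists.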

Hopefully, this example has helped paint the picture of why I think the EMC Enterprise Hybrid Cloud  can significantly aid in Windows Server 2003 replatforming. It can help speed up the deployment of new servers, provide better visibility into the performance and cost of these servers, and automate backup and recovery, too. The efficiencies gained can help customers move much more quickly to avoid the July 2015 End of Support deadline.

Of course, replatforming requires more than simply provisioning new servers with an updated operating system. You need to plan, migrate data and applications, test, and much more. EMC Global Services has a series of services that can assist you with your replatforming efforts all the way from planning and discovery to actual application and data migrations. Add the EMC Enterprise Hybrid Cloud into that mix, and you have a powerful combination.
