Partner News

High Availability is not a Luxury

Eliminating Downtime for Small and Mid-Market Organizations

Information technology (IT) provides enormous value for the small and mid-market business, but it also represents a tremendous point of weakness. When markets are global, employees work around the clock and business is effectively always on, any interruption to application availability can quickly lead to lost revenue, lost productivity, lost brand value, and regulatory problems. Taken to the extreme, extended downtime can even threaten the survival of your business.

How then should your organization deal with this type of existential threat? The painful reality is that most organizations do not deal with it well.

Business continuity — the planning, preparation, and implementation of more resilient business systems in anticipation of unscheduled downtime — is often thought of as an IT problem, and most organizations leave it to the IT department to provide a fix. This invariably leads to the deployment of a wide range of tactical solutions, with no overriding strategy providing guidance. In reality, as the term implies, business continuity is a business problem, and it requires a business approach to fix it.

Here’s a quick way to figure out if your existing business continuity plan is leaving you exposed:

  • If your plan requires significant manual intervention, you’re exposed.
  • If your plan accepts loss of data beyond a few seconds for critical systems, you’re exposed.
  • If your plan cannot restore access to critical systems in minutes, you’re exposed.
  • And if your plan depends on 30-year-old backup and recovery technology, you’re definitely exposed.

Backup and recovery has been the go-to technique for protecting IT systems for 30 years, but it was developed in a much simpler time. Backing up data to tape or disk, or performing snapshots — the modern equivalent of a backup — creates a point-in-time image of application data. Restoring from a point-in-time copy is never going to bring your data more current than the most recent backup. Whether your copy is from 15 minutes ago or is two days old, recovering from a backup means you must face the consequences of data loss. That may be OK for some systems. But, for many of your most important business applications, data loss will be catastrophic.
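To make the exposure concrete, here is a tiny illustration (a sketch only, with example timestamps) of how the age of your last backup translates directly into lost work after a restore:

```python
from datetime import datetime, timedelta

# Everything written between the last backup and the failure is gone after a
# point-in-time restore. The timestamps below are examples, not real data.
def data_loss_window(last_backup, failure_time):
    return failure_time - last_backup

failure = datetime(2014, 8, 25, 14, 0)
print(data_loss_window(failure - timedelta(minutes=15), failure))  # 15 minutes of work lost
print(data_loss_window(failure - timedelta(days=2), failure))      # two days of work lost
```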

Backup and recovery techniques were developed for relatively unsophisticated computing processes, back when there were regularly scheduled periods of time when no one would be using the system. The always-on business applications that you now rely on for day-to-day operations need a technology that guarantees continuous system availability and eliminates the threat of data loss, without relying on a backup window.

Modern high-availability (HA) technology continuously streams application and data changes to a remote location. When disaster strikes, be it an earthquake, a power outage, or a bungled software install, failover to an up-to-date copy of your system is automatic and instant. HA eliminates downtime and eliminates data loss.
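As a rough sketch of the idea (not any vendor's actual implementation), the pattern looks like this: ship each change to a second site as it is committed, and promote that site the moment the primary stops responding. The class names, site names, and five-second heartbeat below are illustrative assumptions:

```python
import time

class Node:
    def __init__(self, name):
        self.name = name
        self.data = {}                     # stands in for application state
        self.last_heartbeat = time.time()

    def apply(self, key, value):
        self.data[key] = value

class Replicator:
    """Streams each write to the replica as soon as the primary commits it."""
    def __init__(self, primary, replica):
        self.primary, self.replica = primary, replica

    def write(self, key, value):
        self.primary.apply(key, value)     # commit locally
        self.replica.apply(key, value)     # ship the change immediately
        self.primary.last_heartbeat = time.time()

def active_node(primary, replica, timeout_s=5.0):
    """Promote the replica if the primary has stopped heartbeating."""
    if time.time() - primary.last_heartbeat > timeout_s:
        return replica                     # replica already holds current data
    return primary

primary, replica = Node("site-A"), Node("site-B")
rep = Replicator(primary, replica)
rep.write("order-1001", "paid")
print(active_node(primary, replica).name)  # "site-A" while the primary is healthy
```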

Read the full article here:

High Availability is not a Luxury

Five Fundamentals for Modern Data Center Availability

Introduction

Company executives, including CIOs and CFOs, have zero tolerance for downtime and data loss. These companies have established high-availability requirements for the applications and critical data the organization uses on a daily basis. Sadly, most companies have not found a way to match these expectations with the harsh realities of meeting the demands of the Always-On Business™. In fact, 82% of CIOs admit they are unable to meet the demand for 24/7 availability of IT services. Protecting your company’s applications and data is more complex than it was in the past and more important than ever. With the introduction of new technologies, including virtualization and cloud computing, and the proliferation of new applications, the aging legacy backup solutions in use today are simply inadequate. This results in unnecessary risk and pain for companies and IT administrators. This white paper describes how data protection has changed and how availability solutions are now available that are far more reliable and efficient. Don’t take data protection for granted. Continue reading to learn about the five fundamentals of modern data center availability.

Use an availability solution built for modern data centers. There are hundreds of data protection tools out there, but few of them are “virtualization-savvy.” Legacy data protection tools tend to see every “server” the same way — as a physical server. By incorrectly assuming that all servers are the same, they introduce tremendous inefficiencies when you attempt to back up or recover applications and data. For example, lengthy file-based backups are performed when only small blocks of data have changed.
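A small, hypothetical example of the difference: a block-aware tool ships only the blocks whose checksums changed, while a file-level tool copies the whole file again. The 4 KB block size and SHA-256 checksums below are assumptions for illustration, not how any specific product works:

```python
import hashlib

BLOCK = 4096  # assumed block size for the illustration

def blocks(data):
    return [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]

def changed_blocks(old, new):
    """Return the (index, block) pairs whose checksums differ between versions."""
    changed = []
    for i, (a, b) in enumerate(zip(blocks(old), blocks(new))):
        if hashlib.sha256(a).digest() != hashlib.sha256(b).digest():
            changed.append((i, b))
    return changed

old = b"A" * BLOCK * 100                               # a 100-block "file"
new = old[:BLOCK] + b"B" * BLOCK + old[2 * BLOCK:]     # only block 1 was modified

delta = changed_blocks(old, new)
print(f"file-level backup copies {len(new)} bytes")             # the whole file
print(f"block-level backup copies {len(delta) * BLOCK} bytes")  # one block
```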

Five Fundamentals for Modern Data Center Availability

EMC Isilon OneFS Operating System

We are seeing an explosion in the growth of data today. Not surprisingly, many industry experts believe that we have entered a new era of Big Data. Along with accelerating growth of new data, the composition of new data is also changing significantly from traditional structured, block data to much more unstructured, file-based data. By 2015, it is anticipated that in excess of 85% of new storage capacity installed in organizations around the world will be for file-based data.

This new world of Big Data is introducing major challenges for enterprise IT managers as well as significant opportunities for businesses across all industry segments. To deliver the optimal storage platform for Big Data, a storage system must provide:

Massive capacity: To accommodate very large and growing data stores

Extreme performance: To minimize response and data ingest times and thereby keep up with the required pace of the business

High efficiency: To reduce storage and related data center costs

Operational simplicity: To manage a growing, large-scale data environment without adding more IT staff

While there are certain similarities with the needs of vertical industries’ Big Data, traditional Enterprise IT has its own set of business drivers that create a unique set of storage requirements including:

Data Security: To minimize risk and meet regulatory and corporate governance requirements

Data Protection: To ensure business continuance and availability to support business operations

Interoperability: To increase business agility and to streamline management

Predictable Performance: To increase productivity and better support business requirements

Today, the clear delineations that once existed between Big Data requirements and enterprise IT requirements have blurred to the point that they are no longer distinguishable. The simple fact is that these two worlds are rapidly converging, creating a need for a fundamentally different way to meet the storage needs that enterprises will have going forward. To address these needs, organizations require an enterprise scale-out storage infrastructure that can meet the combined needs of this new world of Big Data and traditional Enterprise IT. We call this the “scale-out” imperative.

EMC Isilon OneFS Operating System

Veeam Helps Vodafone Netherlands Ensure 24×7 Availability of Telecommunication Services

“I think most of the IT team would agree that our favorite Veeam feature is seamless integration with HP StoreVirtual. I believe this integration will be one of the most useful aspects of our IT infrastructure.” — Jan Spapens, Senior Program Manager, Technology Enterprise Solutions

The Business Challenge
Vodafone Netherlands operates in a dynamic and competitive environment that requires fast adaptation to customers’ changing needs. To automate business operations such as telecommunications delivery, customer support and billing, Vodafone virtualized the IT infrastructure on VMware vSphere. However, protecting the production environment became more difficult as the infrastructure approached 90% virtualization and the number of virtual machines (VMs) increased. At the same time, yet unrelated to virtualization, the customer base grew and data tripled. Vodafone’s legacy backup and monitoring tools couldn’t keep pace, resulting in slow backup and recovery.

The vSphere environment has hundreds of VMs spread across multiple data centers in the Netherlands that run Microsoft SQL Server, Oracle, SAP, Zend Server and a proprietary back-office portal and service navigator. HP ProLiant servers host HP StoreVirtual VSA storage.

“Only a limited number of VMs could be backed up daily, and they were strictly production VMs,” said Jan Spapens, Senior Program Manager, Technology Enterprise Solutions, Vodafone Netherlands. “Backup jobs failed about once a month, requiring us to spend time trying to access our backups. We had no easy way to perform file-level recovery and no clear way to monitor unprotected VMs.”
Vodafone put together a list of requirements for data protection and VM management, and Nikola Stojanovski, Technical Specialist, Vodafone Netherlands, reviewed several backup solutions to find the right match.

“Veeam was a perfect fit for us,” Stojanovski explained. “We needed the best data protection for our production environment, and Veeam offers frequent and reliable backup, storage integration, a smaller storage footprint, high-speed recovery, replication for failover and improved monitoring and reporting. We didn’t have to make any hardware investments, and Veeam integrates seamlessly with HP StoreVirtual.”

Downloads

What’s so special about NTP Software Precision Tiering?

Flexibility and Integration at the Source

NTP Software is the only company that has made the investment to work with the major storage manufacturers and create extensions and interfaces that are part of the storage host OSs of NetApp, EMC, HDS and Windows – storage platforms that collectively host over 85% of all file data.

Our tight integration with these platforms enables the variety of tiering strategies (event-based, age-based, user-driven, quota-driven, application-driven, etc.) that you will read about below. In addition, we have interfaces for CIFS, NFS and REST that allow us to include almost all of the less common storage repositories in our architecture.

This means that you can move files based on what your users actually do, rather than having to conform to a technology’s limits – and from almost any source to just about any destination.
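As a loose sketch of one strategy from the list above (age-based tiering), the logic can be as simple as sweeping a share for files that have not been accessed in a while and moving them to a cheaper tier. The paths and 180-day threshold below are made up for illustration, and real tiering products typically leave a stub behind, which this sketch omits:

```python
import os, shutil, time

PRIMARY = "/mnt/tier1/projects"   # hypothetical primary share
ARCHIVE = "/mnt/tier2/projects"   # hypothetical lower-cost tier
MAX_AGE_DAYS = 180                # illustrative threshold

def tier_by_age(src_root, dst_root, max_age_days):
    """Move files not accessed within max_age_days from src_root to dst_root."""
    cutoff = time.time() - max_age_days * 86400
    for dirpath, _, filenames in os.walk(src_root):
        for name in filenames:
            src = os.path.join(dirpath, name)
            if os.path.getatime(src) < cutoff:
                dst = os.path.join(dst_root, os.path.relpath(src, src_root))
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.move(src, dst)   # real products would leave a stub here

if __name__ == "__main__":
    tier_by_age(PRIMARY, ARCHIVE, MAX_AGE_DAYS)
```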

Why limit your present or your future?

Many Destinations, All with the Same Results

Unlike elephants, it is rarely the case that all files go to the same place to die. Just as you may well have different primary data sources, over time, you are likely to want different Tier 2 and archive destinations. NTP Software Precision Tiering accommodates file system, tape, object store, and cloud destinations. We can even send copies of individual files to multiple destinations.

Most organizations will ultimately use tape, cloud, and file/object technology. With NTP Software Precision Tiering you can avoid having to bear the cost of three separate infrastructures to do so.

Downloads

Secure, Enterprise File Sync and Share with EMC Syncplicity Utilizing EMC Isilon, EMC Atmos, and EMC VNX

Abstract

This white paper explains the benefits to the extended enterprise of the on-premise, online file sharing storage solution from EMC Syncplicity using EMC Isilon, EMC Atmos, and EMC VNX. It also discusses solution deployment and load balancing options.

Executive summary

Today’s corporate employees expect to have access to data and services across all their devices as if they were working at the corporate office. Online file sharing (OFS) is growing in popularity because it helps users access their work files from any device. However, to achieve this, IT must adopt enterprise-grade solutions that give users the access they need while providing the security and controls that protect company data.

To deliver this level of control and protection, EMC® Syncplicity® has launched an on-premise storage solution that combines the unmatched flexibility and ease of use of EMC Syncplicity’s cloud-based file sync and sharing technology with a secure, on-premise storage infrastructure based on EMC Isilon, EMC Atmos and EMC VNX storage. EMC VNX provides a powerful, yet simple way to manage data and applications, enabling enterprises to expect much more from their storage. EMC uniquely provides enterprises a single-vendor, end-to-end solution to simplify deployments and to provide a single point of contact for your online file sharing needs.

Downloads

Unitrends Service Provider Program Achieves 278 Percent Revenue Growth in its First Year

Program enables service providers to drive recurring revenue through high-margin data protection services

BURLINGTON, Mass., August 25, 2014 – Unitrends today announced that its Service Provider Program experienced 278 percent revenue growth in its first year. Launched in August 2013, Unitrends’ Service Provider Program enables hosting, managed service and cloud providers to generate recurring revenue through high-margin service offerings, including Disaster Recovery as a Service (DRaaS), Backup as a Service (BaaS) and Replication as a Service (RaaS).

Hybrid Cloud Backup for Complete Disaster Recovery Assurance
Leveraging Unitrends’ capability stack, service providers can easily and affordably build a hybrid cloud backup offering that enables them to become a full-service disaster recovery partner to their customers. Flexible on-premise appliances achieve customers’ recovery point objectives (RPOs) and recovery time objectives (RTOs), while data replication to the cloud and disaster recovery spin-up in the cloud provide complete recovery assurance and automated business continuity. Unitrends’ capability stack also offers 100 percent automated disaster recovery testing and pushbutton failover, ensuring restoration within defined service level agreements (SLAs).
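The SLA angle can be pictured with a short, hedged sketch: compare what a disaster recovery test actually measured against the RPO and RTO each customer was promised. The customer names and thresholds below are invented for illustration and are not part of any Unitrends product:

```python
from dataclasses import dataclass

@dataclass
class DrTestResult:
    customer: str
    data_loss_minutes: float    # recovery point actually observed in the test
    recovery_minutes: float     # time actually taken to restore service

def meets_sla(result, rpo_minutes, rto_minutes):
    """True if the measured recovery point and time stay within the agreed SLA."""
    return (result.data_loss_minutes <= rpo_minutes
            and result.recovery_minutes <= rto_minutes)

tests = [DrTestResult("customer-a", 2.0, 12.0),
         DrTestResult("customer-b", 20.0, 45.0)]
for t in tests:
    ok = meets_sla(t, rpo_minutes=15, rto_minutes=30)
    print(t.customer, "within SLA" if ok else "SLA missed")
```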

“Non-intrusive disaster recovery testing is one of the major advantages of Unitrends’ Service Provider Program,” said Stephen Evans, IT director at DirectCloud, one of Unitrends’ managed service provider (MSP) partners. “We can automate failover and recovery testing based on RTOs and RPOs at any time without impacting our customers’ production environments. A lot of vendors provide virtual backup and recovery, but Unitrends enables us to support our customers with backup and recovery assurance that meets defined SLAs no matter how diverse or complex the environment. That’s a significant competitive advantage for us.”

Downloads

Virtual Server Backup Software Buyer’s Guide

The Insider’s Guide to Evaluating Virtual Server Backup Software
By Charley McMaster and Jerome Wendt

With the proliferation of virtual machines (VMs), organizations are appropriately concerned about their ability to back up and restore the growing amount of data residing on these VMs. As such, companies are looking for virtual server backup software solutions.

The benefits of server virtualization are already well known. Power reduction, hardware optimization, improved application availability and lower upfront capital and ongoing operational costs are just some of the benefits that organizations already enjoy. Given these benefits, it comes as no surprise that Gartner estimates that as many as 86 percent of server workloads will be virtualized by 2016.

Since the release of the DCIG 2013 Virtual Server Backup Software Buyers’ Guide, there has been a steady stream of new software releases improving the capabilities of these products; acquisitions to strengthen existing product portfolios; and, divestitures to others interested in entering the virtual server backup software market. In just the last year, some of the changes and transactions that have taken place include:

  • Computer Associates announced its intention to divest its arcserve Unified Data Protection software to Marlin Equity Partners
  • Data protection company Infrascale purchased Eversync Solutions
  • Unitrends acquired PHD Virtual, a virtual server backup software provider, and made the two respective products available as a single product, the Unitrends Certified Recovery Suite (UCRS)

Virtual server backup software providers continue to push the boundaries by introducing new technology to set themselves apart from others. A telling sign of the ongoing innovation in this space is that the products in this Buyer’s Guide are more competitive than ever.

A capability that lagged in the prior Buyer’s Guide may now be a product’s defining feature, helping to position it ahead of others in this Guide. Virtual-to-physical (V2P) and physical-to-virtual (P2V) recovery are now standard on many products, along with support for the VMware APIs for Data Protection (VADP), instant recovery, deduplication, vSphere integration, and vCenter Server backup and restore capabilities.

Advancements are seen in other areas as well. One is greater management capabilities using Microsoft Volume Shadow Copy Service (VSS) for certain applications such as Microsoft SQL Server. Another is enhanced management capabilities using vCloud Director to help create and manage virtual data centers.

Many virtual server backup software products now also offer other key features such as:

  • Connectivity to cloud storage service providers to store data in the cloud
  • Increased restore granularity for Exchange and SharePoint applications
  • Virtual machine (VM) recovery to other hypervisors and even the cloud
  • A wider level of protection for more hypervisors and guest operating systems

These changes, coupled with the ongoing virtualization of data centers of all sizes, have end-users, resellers and vendors alike clamoring for an update to the DCIG Virtual Server Backup Software Buyer’s Guide.

Downloads

Stubs – the good, the bad and the ugly

An NTP Whitepaper

We’ve all heard the saying, “The devil is in the details.” Nowhere is this truer than it is when talking about the stubs used in tiering. Stubs require the cooperation of the storage hosts, the network, protocols, security, end-user applications, and client systems. What works for one may or may not work for another.
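For readers less familiar with them, here is a loose sketch of the stub concept itself: a small placeholder left on primary storage that records where the real file now lives, plus a recall step that brings it back. The JSON layout below is purely illustrative; as the paper explains, real stubs depend on the host OS, protocols, security, and applications involved:

```python
import json, os, shutil

def archive_with_stub(path, archive_dir):
    """Move the file to the archive tier and leave a small stub in its place."""
    os.makedirs(archive_dir, exist_ok=True)
    dest = os.path.join(archive_dir, os.path.basename(path))
    shutil.move(path, dest)                      # full file goes to the lower tier
    with open(path, "w") as stub:                # tiny placeholder takes its place
        json.dump({"stub": True, "archived_to": dest}, stub)

def recall(path):
    """Bring the original file back when the stub is accessed."""
    with open(path) as f:
        meta = json.load(f)
    if meta.get("stub"):
        os.remove(path)                          # drop the stub
        shutil.move(meta["archived_to"], path)   # restore the original content
```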

And then there are the issues that only emerge over the long term. Someday the file’s original host system will be gone. What then?

This paper addresses all of these issues and more…

Downloads

Reducing Primary Storage Costs and Complexity with NTP Software Precision Tiering

An NTP Software White Paper

Allocating resources to high-cost storage for low-use data doesn’t make sense. Most organizations would rather match storage to data and use the saved budgetary resources for other projects. NTP Software Precision Tiering™ makes it easy for you to find and migrate data automatically to the most appropriate storage tier, ensure proper archiving of valuable data, meet compliance and governance requirements, and ensure proper data protection.

Executive Summary
Over the last few years, file data (also called unstructured data) has grown explosively. On most networks, file data accounts for 85% or more of the total storage requirements. Managing this data effectively is now one of the most important tasks for many organizations.

For years, migrating data from primary storage to other storage tiers has been a part of many companies’ data management strategies, both for data protection and as a way to control costs. However, with the explosive growth of file data, the costs and resources required for data migration have made its benefits more difficult to realize.

The cost of conventional data migration methods is essentially proportional to the amount of data in primary storage. As the quantity of file data increases – which it continues to do at an average rate of about 50% a year (per Gartner Group) – the resources required by conventional data migration methods increase commensurately.
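A quick back-of-the-envelope run shows what that growth means in practice; the 100 TB starting point is a made-up example:

```python
# If migration effort is roughly proportional to the data sitting on primary
# storage, ~50% annual growth (the rate cited above) compounds quickly.
data_tb = 100.0
for year in range(1, 6):
    data_tb *= 1.5
    print(f"year {year}: ~{data_tb:.0f} TB to sweep with conventional migration")
```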

In fact, some end-user organizations have already reached the point where their data migration windows are too small to accommodate all of the processes that need to be done to maintain their data migration strategy.

The solution – efficient data migration – is a paradigm shift to Precision Tiering. With Precision Tiering, the majority of the operating cost and resources demanded by conventional data migration are eliminated, while all of the benefits are preserved. Precision Tiering makes it easy for you to find and migrate data automatically to the most appropriate storage tier, ensure proper archiving of valuable data, meet compliance and governance requirements, and ensure proper data protection.

Downloads