Author Archives: Barry Goodwin



Is the future of computing in the cloud? Increasingly, it appears to be headed in that direction – and for good reason. Cloud computing presents numerous cost, agility, and operational advantages that are undeniably compelling. In fact, in the vast majority of enterprise data centers, cloud-like architectures are quickly taking root. Companies are virtualizing their resources and partitioning some of their applications within their own four walls.

However, risk-averse enterprise IT professionals are understandably cautious about simply moving their entire IT portfolio of resources and services into a 100-percent cloud architecture. As with most technology-adoption curves, enterprises have not embraced an “all-or-nothing” paradigm, instead preferring a pragmatic, “evolution, not revolution” approach of cautious incrementalism. This helps them account for their existing investments, custom work, business requirements, and risk posture. Not everything will move to the cloud – at least not in the foreseeable future. That’s why so many IT professionals advocate a so-called “hybrid” approach. But what do we mean by “hybrid,” and how do we optimally balance the various components of a hybrid cloud architecture?

View the full document here: The Hybrid Cloud: A Pragmatic View of Architectures

High Availability is not a Luxury

Eliminating Downtime for Small and Mid-Market Organizations

Information technology (IT) provides enormous value for the small and mid-market business, but it also represents a tremendous point of weakness. When markets are global, employees work around the clock and business is effectively always on, any interruption to application availability can quickly lead to lost revenue, lost productivity, lost brand value, and regulatory problems. Taken to the extreme, extended downtime can even threaten the survival of your business.

How then should your organization deal with this type of existential threat? The painful reality is that most organizations do not deal with it well.

Business continuity — the planning, preparation, and implementation of more resilient business systems in anticipation of unscheduled downtime — is often thought of as an IT problem, and most organizations leave it to the IT department to provide a fix. This invariably leads to the deployment of a wide range of tactical solutions, with no overriding strategy providing guidance. In reality, as the term implies, business continuity is a business problem, and it requires a business approach to fix it.

Here’s a quick way to figure out if your existing business continuity plan is leaving you exposed:

  • If your plan requires significant manual intervention, you’re exposed; 
  • If your plan accepts loss of data beyond a few seconds for critical systems, you’re exposed; 
  • If your plan cannot restore access to critical systems in minutes, you’re exposed; 
  • And, if your plan depends on 30-year-old backup and recovery technology, you’re definitely exposed.

Backup and recovery has been the go-to technique for protecting IT systems for 30 years, but it was developed in a much simpler time. Backing up data to tape or disk, or performing snapshots — the modern equivalent of a backup — creates a point-in-time image of application data. Restoring from a point-in-time copy will never bring your data more current than the most recent backup. Whether your copy is from 15 minutes ago or two days old, recovering from a backup means facing the consequences of data loss. That may be OK for some systems, but for many of your most important business applications, data loss will be catastrophic.
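To make the exposure concrete: the worst-case data loss from a point-in-time restore is simply the time elapsed since the last backup. A minimal sketch, using an illustrative backup schedule and failure time (not drawn from any real deployment):

```python
from datetime import datetime, timedelta

def data_loss_window(last_backup: datetime, failure_time: datetime) -> timedelta:
    """Worst-case data loss when restoring from a point-in-time backup:
    everything written between the last backup and the failure is gone."""
    return failure_time - last_backup

# Hypothetical schedule: nightly backup at 02:00, failure at 16:30 the same day.
last_backup = datetime(2015, 3, 10, 2, 0)
failure = datetime(2015, 3, 10, 16, 30)
loss = data_loss_window(last_backup, failure)
print(loss)  # 14:30:00
```

With a nightly backup and a mid-afternoon failure, the restore discards fourteen and a half hours of transactions – precisely the gap that is unacceptable for critical systems.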

Backup and recovery techniques were developed for relatively unsophisticated computing processes, back when there were regularly scheduled periods of time when no one would be using the system. The always-on business applications that you now rely on for day-to-day operations need a technology that guarantees continuous system availability and eliminates the threat of data loss, without relying on a backup window.

Modern high-availability (HA) technology continuously streams application and data changes to a remote location. When disaster strikes, be it an earthquake, a power outage, or a bungled software install, failover to an up-to-date copy of your system is automatic and instant. HA eliminates downtime and eliminates data loss.
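The streaming model can be sketched in a few lines: rather than waiting for a backup window, every change is pushed onto a stream and applied to a replica as it happens, so the replica is current at the moment of failover. This is a toy illustration with made-up names, not any vendor’s replication protocol:

```python
import queue
import threading

class Replica:
    """Stand-in for the remote copy of the system."""
    def __init__(self):
        self.state = {}

    def apply(self, change):
        key, value = change
        self.state[key] = value

def stream_changes(changes, replica):
    """Continuously stream each change to the replica as it occurs."""
    stream = queue.Queue()

    def replicator():
        while True:
            change = stream.get()
            if change is None:   # sentinel: stream closed
                break
            replica.apply(change)

    t = threading.Thread(target=replicator)
    t.start()
    for change in changes:
        stream.put(change)       # no backup window: changes flow immediately
    stream.put(None)
    t.join()

primary_writes = [("order-1", "paid"), ("order-2", "shipped")]
replica = Replica()
stream_changes(primary_writes, replica)
# On failover, the replica already holds the latest state:
print(replica.state)  # {'order-1': 'paid', 'order-2': 'shipped'}
```

Because the replica applies each change as it arrives, it is at most one in-flight change behind the primary – which is why failover can be both immediate and lossless.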

Read the full article here:

High Availability is not a Luxury

Backup Vendor Clouds – cannibalising partners or a welcome addition to the portfolio?

Ben Cuppello – Sales Director

First, allow me to add a caveat: your view on the topic at hand is likely to depend on whether you are looking at it as a vendor, a reseller, or a service provider leveraging vendor technology to deliver a managed service. I sit in the service provider camp – leveraging Arcserve technology to provide Backup as a Service (BaaS) and Disaster Recovery as a Service (DRaaS) through the reseller channel.

There is no doubt that current technology and ever-decreasing storage costs are making the prospect of leveraging a cloud-based service for offsite backups more realistic, and a valid alternative to traditional tape-based solutions. However, should this cloud platform be provided by the backup vendor themselves? By a partner MSP? If the vendor cultivates a network of MSPs to offer the service, should it then build its own, essentially competing with the partners it has previously encouraged? What would the end user prefer? What boxes does this platform need to tick for it to become a viable option?

We believe that, in order to drive adoption of a cloud platform amongst end users, it needs to provide a stable, cost-effective option with tight integration into the backup technology, along with the ability to satisfy compliance and security requirements. It also needs to be backed by a responsive support team available around the clock.

An example of an organisation that delivers an effective solution is Datto; however, Datto has always attached a cloud offering to its appliances and has never sought to develop MSPs to leverage its technology as part of their own offerings.

Arcserve technology is perfectly suited to delivering BaaS and DRaaS solutions, with its ability to target almost any infrastructure endpoint, flexible deployment options, simple multi-tenant reporting, and a slick service provider license model. Core DataCloud have built an offering which we successfully deliver to several hundred servers across multiple clients. Traditionally, Arcserve have been a channel-led software provider; having recently entered the appliance market, they have now announced the availability of their own cloud platform in the US.
At this stage the platform is US-only, meaning it is impractical for the majority of UK-based organisations to adopt. However, it clearly represents a concern for us, as its release in the UK could create a competitor out of one of our key vendor partners. Will there be features that are not made available to us, leaving us at a disadvantage? How will it be priced?

The above is designed to be thought-provoking rather than to provide answers – I’m interested to see how people feel on this matter, and whether, from a vendor’s perspective, it would be more profitable to grow an MSP network than to create their own offering.

Best Practice Security in a Cloud-Enabled World


The cloud will be a growing part of your IT environment. This is inevitable, particularly in consideration of the following:

Economic Attractiveness – The economic attractiveness of cloud infrastructure services (e.g., Infrastructure as a Service – IaaS) is rising with each round of price wars. Lured by this attractiveness, will your organization be faced with escalating data protection, operational risk, management complexity, and compliance uncertainty?

Self-Service Proclivity – Findings from a late-2013 Frost & Sullivan survey show that 70 percent of end users are ignoring IT approval procedures and subscribing to un-vetted cloud services, for a variety of reasons: support business objectives, improve productivity, and foster innovation. Can your organization mitigate the security risks associated with uncontrolled and invisible use of Software as a Service (SaaS)?

Opportunities in the Internet of Things – The Internet of Things has arrived, as the ability to send and collect burgeoning streams of data is no longer technologically constrained. Similarly, producing actionable insights from this data mountain is no longer “on the horizon,” but aided by the elasticity of the cloud. Readying your organization to skillfully ride the twin waves of the Internet of Things and Big Data & Analytics cannot wait – but are you prepared to tackle the information security challenges that arise when the cloud is part of the equation?

Addressing these questions is feasible through proactive and sensible risk management. While information technology does move rapidly and with a degree of unpredictability, a comprehensive risk management approach, designed to flex and adapt, enables organizations to embrace cloud services with security confidence.

In this paper, we present a straightforward approach to cloud security. The structural foundation of this approach will not only assist in mitigating the risks associated with cloud deployments and usage, but also improve and standardize your security posture and practices across all your environments – public and private clouds as well as bare-metal servers – and allow you to skip future security overhauls brought on by the emergence of new types of information technologies and security threats.


View the full document here: Best Practice Security in a Cloud-Enabled World

UK Cloud Adoption Trends For 2015

This Executive Brief is a summary of the white paper:

The Normalisation of Cloud in a Hybrid IT Market: UK Cloud Adoption Snapshot & Trends for 2015


• Hybrid IT is an approach to enterprise computing in which an organisation provides and manages some information technology (IT) resources in-house but uses cloud-based services for others.


• A 2014 survey of 250 senior IT and business decision makers across small to medium-sized businesses and public sector organisations in the UK was commissioned by the Cloud Industry Forum (CIF). Its purpose was to reveal trends in the end-user community with respect to adoption of cloud services.
• Of the surveyed organisations, 20% had 5 to 20 employees and 8% had more than 5,000 employees. The remainder fell in between, with the largest share (26%) having 51 to 200 employees.
• The surveyed organisations operated in a variety of sectors, from IT and technology (15%) to hospitality (1%).


• The vast majority of organisations (79%) consider the cloud as part of their IT strategy. 72% of organisations consider the cloud when refreshing infrastructure. 61% of organisations run Windows Server 2003, for which support will terminate in July 2015.
• 78% of organisations report running IT with in-house staff. 22% use a managed service provider, with the highest concentration among organisations with fewer than 20 employees.
• The survey found that the primary business objectives in migrating to the cloud were increasing flexibility in access to technology (80%) and increasing speed of access to technology (79%).
• The major inhibitors to migrating services to the cloud were reported as lack of budget (37%) and investment in legacy systems (34%), followed by security and privacy concerns (31%) and difficulties integrating legacy systems with cloud services (27%).
• 75% of organisations surveyed stated that they had security concerns about migrating specific applications to the cloud. 59% reported concerns over data protection, and 47% cited investments in on-premise systems as a reason for not using cloud services.
• 9% of respondents reported cost savings in using cloud services but 18% anticipated cost savings in the next 5 years. The greatest number reporting expected cost savings were organisations with under 20 employees.
• 47% of organisations reported that they had a significant competitive advantage using cloud services. Among public sector organisations 71% reported that they had some advantage.
• Almost half the organisations surveyed that did not already use cloud services reported that they anticipated using cloud service applications within the next 12 months.

View the full document here: UK Cloud Adoption Trends For 2015

Vendor-Branded Backup Appliances

Vendor-branded backup appliances – why?

by Brian Jones – Pre-Sales.

Before I get started, let me limit what I will discuss in this blog. There are many areas of interest in this market on which I could comment; however, today I want to look at the recent trend for a host of backup software vendors to launch their own branded backup appliances. This has been most noticeable recently among vendors who target the small and medium enterprise market – large enterprises have had purpose-built backup infrastructure for decades now.

What I would like to look at are the motivations of the organisations concerned in launching these products, and whether they represent best practice and value for the end user.

A quick Google search revealed that (no doubt among others) Arcserve, Datto, Unitrends, Asigra, Commvault and StorageCraft all now have appliances available. Notable names missing from this list are Veeam and Zerto, for the obvious reason that they are products specific to virtualised environments and align themselves with a virtual deployment model.

We can readily see the advantages to these vendors in supplying their own hardware, preconfigured with their software offerings. I would not like to suggest that the order in which I discuss these below indicates the priorities of those organisations!

• Profit – margin on the boxes themselves
• Client tie-in – a dedicated vendor-specific appliance makes it less likely a client will switch to an alternative backup technology
• Ease of administration – most appliances come with an ‘all you can consume’ licence, considerably reducing the administration of some unwieldy licensing schemes
• Ease of support – by standardising the hardware platform, supporting the solution becomes easier

However, from the point of view of the end user, what are the pros and cons of the vendor appliance?


Pros:

• Improved support – likely due to familiarity with the hardware platform
• Optimal hardware configuration – clearly the vendor is in the best position to specify the appliance to maximise the performance of the software. That said, this information should be freely available anyway.
• Ease of implementation – with pre-installed software and usually a wizard to guide the end user through their initial setup of the solution, vendor appliances are easier to get up and running – assuming the end user is doing this themselves.

Cons:

• Sizing – the vendors usually offer a range of appliances for different sizes of organisation, but it is questionable whether an end user would obtain the same ROI as they could from a box specified for them specifically.
• Scalability – most of these appliances are not expandable, so should the end user fill the device, they have no option but to delete data or buy another appliance.
• Value for money – there is a food chain to support in the provision of these devices to market. Vendors do not want to support a worldwide warranty and spare-parts network, so most source the hardware from the same factory in China, then brand it as their own. Do they represent good value for money as a piece of tin? Probably not; the end user is paying for the convenience of the product.


Conclusions: If an end user does not have the experience to build and deploy an appliance themselves, and does not have a partner organisation with this specialist knowledge and experience, then vendor appliances are a good choice for confidently implementing a backup and DR solution.

If however the end user has a specialist partner who can deploy and support a box specified individually for them, this is the preferable solution.


Five Fundamentals for Modern Data Center Availability


Company executives, including CIOs and CFOs, have zero tolerance for downtime and data loss. These companies have established high-availability requirements for the applications and critical data the organization uses on a daily basis. Sadly, most companies have not found a way to match these expectations with the harsh realities of maintaining the demands of the Always-On Business™. In fact, 82% of CIOs admit they are unable to meet the demand for 24/7 availability of IT services.

Protecting your company’s applications and data is more complex than it was in the past, and more important than ever. With the introduction of new technologies, including virtualization and cloud computing, and the proliferation of applications, the aging legacy backup solutions in use today are simply inadequate. This can result in unnecessary risk and pain for companies and IT administrators.

This white paper will describe how data protection has changed and how availability solutions are now available that are much more reliable and efficient. Don’t take data protection for granted. Continue reading to learn about the five fundamentals of modern data center availability.

Use an availability solution built for modern data centers. There are hundreds of data protection tools out there, but few of them are “virtualization-savvy.” Legacy data protection tools tend to see every “server” the same way – as a physical server. By incorrectly assuming that all servers are the same, tremendous inefficiencies occur when you attempt to back up or recover applications and data. For example, lengthy file-based backups are performed when only small blocks of data have changed.
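The block-level alternative alluded to here can be illustrated with a toy sketch: hash fixed-size blocks and back up only the ones whose hashes changed, instead of re-copying whole files. This is illustrative Python under assumed names and block size, not any product’s implementation:

```python
import hashlib

BLOCK_SIZE = 4096  # illustrative block size, not a vendor default

def block_hashes(data: bytes):
    """Hash each fixed-size block so unchanged blocks can be skipped."""
    return [hashlib.sha256(data[i:i + BLOCK_SIZE]).hexdigest()
            for i in range(0, len(data), BLOCK_SIZE)]

def changed_blocks(old: bytes, new: bytes):
    """Indices of blocks that differ -- only these need to be backed up."""
    old_h, new_h = block_hashes(old), block_hashes(new)
    return [i for i, h in enumerate(new_h)
            if i >= len(old_h) or h != old_h[i]]

disk_v1 = bytes(BLOCK_SIZE * 4)          # four zeroed blocks
disk_v2 = bytearray(disk_v1)
disk_v2[BLOCK_SIZE * 2] = 0xFF           # a single byte changes in block 2
print(changed_blocks(disk_v1, bytes(disk_v2)))  # [2]
```

A file-based backup would re-copy the entire image after that single-byte change; the block-level comparison identifies one 4 KB block to transfer.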

Five Fundamentals for Modern Data Center Availability