By Anthony Spiteri, Global Technologist, Veeam
The global pandemic has irrevocably changed the way businesses everywhere operate, tightening the link between a robust IT infrastructure and business continuity.
However, the transition has not been seamless, with many businesses unable to adapt to the new environment without downtime. In July alone, Australians reported a net loss of $12.3 million from more than 18,500 scams.
With all this going on, it's surprising that many IT teams still rely on legacy backup solutions.
Most legacy backup solutions in the market today are difficult to use. Because so much effort goes into simply keeping backups running, IT admins lack the time, resources and energy to tackle real business challenges.
Today's backup technology landscape is more agile and multifaceted than ever, offering options for every size and budget, which paradoxically makes the right choice harder to identify. These days, IT needs to think not one but several steps ahead, taking ransomware, vendor lock-in, storage capacity and cloud mobility, as well as unpredictable global economic and health factors, into the equation.
The events of 2020 have compounded these pressures. Data protection now needs to be a priority.
Below are four of the main challenges facing the industry today as it still relies on legacy solutions.
1. Dealing with unreliable backup
Under the pressure of the pandemic, IT teams need to ensure employees' work is still backed up from home. This is not an easy job when using legacy solutions, which rely on 20-year-old code retrofitted for the IT challenges of today.
In addition, dedupe databases often become error-prone and can cause complete data loss. Many solutions lack data recovery verification, or only provide it for a limited set of platforms. Another issue is visibility into what is working and what isn't. All too often, IT admins only find out there's an issue when it's too late.
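The core idea behind recovery verification is simple: a backup only counts if what you restore matches what you stored. A minimal sketch of that check, using a checksum comparison (this is an illustration only, not any vendor's implementation; the data and hash choice are assumptions):

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Return the SHA-256 hex digest of a byte string."""
    return hashlib.sha256(data).hexdigest()

def verify_backup(original: bytes, restored: bytes) -> bool:
    """A backup is only good if a test restore matches the source exactly."""
    return sha256_of(original) == sha256_of(restored)

source = b"payroll records, Q3"
print(verify_backup(source, source))        # restore matches: backup is usable
print(verify_backup(source, b"corrupted"))  # silent corruption is caught
```

Real products verify far more than a checksum (booting restored VMs, application-level checks), but without at least this level of automated validation, an admin has no way of knowing a backup is bad until a restore fails.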
Many companies chalk this up to a lack of training or skills. This is untrue. If a backup solution is stable, reliable and easy to use, then you shouldn’t need a Ph.D. to use it.
2. The cost of protecting data
Data protection can be quite costly when you take into account hardware, software and storage costs.
Not to mention the less tangible, often forgotten costs: downtime and data loss. For example, a recent study estimated that one hour of downtime costs $67,651 for a high-priority application and $61,642 for a normal-priority application.
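Those per-hour figures add up quickly. A back-of-the-envelope calculation, using the study's numbers with an assumed outage profile purely for illustration:

```python
HIGH_PRIORITY_COST_PER_HOUR = 67_651  # per-hour downtime cost, high-priority app
NORMAL_COST_PER_HOUR = 61_642         # per-hour downtime cost, normal-priority app

# Assumed profile for illustration: 4 hours of downtime a year,
# affecting 2 high-priority and 3 normal-priority applications.
hours_down = 4
annual_cost = hours_down * (2 * HIGH_PRIORITY_COST_PER_HOUR
                            + 3 * NORMAL_COST_PER_HOUR)
print(f"${annual_cost:,}")  # $1,280,912
```

Even a modest outage footprint lands well over a million dollars a year, before counting any permanent data loss.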
In addition, downtime and data loss can damage your relationships with customers and your brand's integrity.
3. Seeing a return on investment
It can sometimes be hard to see a return on investment when it comes to data backup.
An ROI consideration for your benefit is data reuse. All data protection solutions encapsulate a great deal of data. In today’s ecosystem, data is power, and the right ROI calculation isn’t simply crunching the numbers of time saved versus money invested, but also the value provided by putting your data to work.
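One way to frame that calculation is to add a data-reuse term alongside the usual time-saved-versus-money-invested figures. Every input below is an assumption chosen for illustration, not a benchmark:

```python
def backup_roi(annual_cost: float, hours_saved: float, hourly_rate: float,
               data_reuse_value: float) -> float:
    """ROI = (benefits - cost) / cost. The data_reuse_value term captures
    benefits beyond admin time saved, e.g. reusing backup copies for
    test/dev environments or analytics."""
    benefits = hours_saved * hourly_rate + data_reuse_value
    return (benefits - annual_cost) / annual_cost

# Illustrative inputs only: $50k annual spend, 400 admin-hours saved
# at $90/hour, plus $30k of value from putting backup data to work.
roi = backup_roi(annual_cost=50_000, hours_saved=400, hourly_rate=90,
                 data_reuse_value=30_000)
print(f"{roi:.0%}")  # 32%
```

The point of the sketch: with these inputs, time savings alone ($36k) would leave the project underwater; it is the data-reuse term that pushes the investment into positive territory.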
4. Time and resource drought
IT teams are also challenged with a lack of time and resources. With so much going on, the last thing they need to be doing is 'babysitting their backup.' Far too many products in this industry are complicated and hard to use.
Another important factor is that your backup software needs to be able to evolve with the organisation. If adding a new NAS device or changing cloud storage forces you to overhaul your data protection strategy and spend time re-educating IT staff, the software is holding the business back.
It is no longer an option for businesses to make do with older legacy solutions. Businesses are being forced to digitally transform and adopt all sorts of new technologies in order to survive and thrive during the work from home era.
It's time for IT teams to step up and introduce backup solutions that are reliable, simple and flexible.