Data protection is a requirement for every organization, but that conversation has evolved over the years. It used to be a lot simpler because there were far fewer factors to consider. By now, every data protection vendor can do a pretty good job of backing up the platforms they support. But how many of them can recover quickly, or even instantly? How many offer ransomware alerts? How many offer immutability on the backup copies? How many can leverage that isolated backup data for other purposes, such as data analytics? How many can restore to a different platform (the public cloud, for example)? And the list goes on and on…
Never have I ever seen an IT leader rush to replace their current legacy solution with the latest, greatest, fastest data protection solution out there… I am talking about the kind of excitement where someone is willing to go to battle with the procurement team to get that purchase approved. Let’s face it: backups are usually just an insurance policy until the day your business truly relies on them. Nowadays you can also argue that you can do a lot more with this backup data, but in my opinion that is an added bonus.
Most people will not push harder than usual for this purchase until the unexpected happens and the business realizes how important it is to get back up and running in the shortest amount of time. I personally think restore speed should always lead that conversation. The days when backups were in place just to check a compliance box are over. The question is no longer if you will have to restore from backups because of a breach; the question is WHEN.
If you are not sure, I suggest you discuss with your team what that RTO expectation looks like. How much downtime can each application sustain? Different technologies come with different expectations: some may be recoverable within seconds, others within minutes, and some within hours!
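That team discussion can start with something as simple as a table of per-application RTO targets. The sketch below shows one way to capture it; the application names and numbers are purely hypothetical, not taken from any real environment or product.

```python
# Hypothetical per-application RTO targets, in seconds.
# The apps and values below are illustrative examples only.
RTO_SECONDS = {
    "payments-api": 30,         # seconds-level recovery expected
    "reporting-db": 15 * 60,    # minutes of downtime are tolerable
    "internal-wiki": 4 * 3600,  # hours of downtime are tolerable
}

def meets_rto(app: str, measured_restore_seconds: float) -> bool:
    """Return True if a measured restore time satisfies the app's RTO target."""
    return measured_restore_seconds <= RTO_SECONDS[app]

# Example: a 20-second restore meets the payments-api target,
# but a 30-minute restore misses the reporting-db target.
print(meets_rto("payments-api", 20))    # True
print(meets_rto("reporting-db", 1800))  # False
```

Even a rough table like this forces the conversation the post is asking for: each application owner has to commit to a number, and that number can then be tested against what your backup solution can actually deliver.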