Cost Optimization for One Scenario When Backing Up to AWS for Commvault Backups

What is one way to optimize costs? You can measure how many copies of the data are stored per retention standard. With full deduplication, a copy measure of approximately 1.XX is a good indicator of efficiency. In addition to the copy measure, you can also measure cost in TB-months: the lower the cost relative to other storage classes, the more efficiently you are running your backups.
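The two metrics above are simple ratios, sketched below. The function names and the example figures (100 TB protected, 120 TB stored) are illustrative assumptions, not values reported by Commvault.

```python
def copy_measure(stored_tb: float, protected_tb: float) -> float:
    """How many copies of the protected data are actually stored.
    With full deduplication, roughly 1.XX is considered efficient."""
    return stored_tb / protected_tb

def tb_months(stored_tb: float, months_retained: float) -> float:
    """Storage consumption in TB-months, for comparing storage classes."""
    return stored_tb * months_retained

# Example: 100 TB of protected data, 120 TB stored after deduplication.
print(copy_measure(120, 100))  # → 1.2
print(tb_months(120, 15))      # → 1800
```

Comparing TB-months across, say, S3 Standard versus a colder storage class gives a per-class cost you can rank directly.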

Ex. Our retention standard is 15 months. To minimize copies, you can create a new backup set with the following schedule: one full every 15 months, retained for 30 months. The full needs the longer retention because if it were deleted after only 15 months, the differential chain that depends on it would break. Since differential backups are cumulative (each contains all changes since the last full), the differentials themselves only need a 15-month retention. This way, storage is optimized.
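The retention reasoning can be checked with a small timeline sketch. The helper below and its month numbers are assumptions for illustration only, not Commvault logic: a restore point needs both its differential and the full it chains to still within retention.

```python
def chain_intact(restore_point: int, now: int,
                 full_retention: int, full_cycle: int = 15) -> bool:
    """True if, at month `now`, we can still restore `restore_point`:
    the differential (15-month retention) and the full it depends on
    must both be retained. All units are months; full every 15 months."""
    diff_ok = (now - restore_point) <= 15
    full_taken = (restore_point // full_cycle) * full_cycle
    full_ok = (now - full_taken) <= full_retention
    return diff_ok and full_ok

# At month 29 the 15-month standard still requires restoring month 14,
# whose differential chains to the month-0 full.
print(chain_intact(14, 29, full_retention=15))  # False: month-0 full aged out
print(chain_intact(14, 29, full_retention=30))  # True: chain preserved
```

This is why the full's retention must be roughly double the retention standard when fulls are taken once per standard period.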

Ex. Our retention standard is 15 months. To be even more efficient and store only one copy of the full, you can take one full every 30 months with a 30-month retention standard.
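A rough steady-state comparison of the two schedules, using an assumed full-backup size, shows where the second approach saves on full copies:

```python
FULL_TB = 100  # assumed size of one full backup, for illustration

# Schedule 1: a full every 15 months, each retained 30 months
# -> about two fulls on hand at any time.
fulls_schedule_1 = 30 / 15

# Schedule 2: a full every 30 months, retained 30 months
# -> about one full on hand at any time.
fulls_schedule_2 = 30 / 30

print(fulls_schedule_1 * FULL_TB)  # → 200.0 TB of full copies
print(fulls_schedule_2 * FULL_TB)  # → 100.0 TB of full copies
```

The flip side, as the next line notes, is that tradeoffs remain: with a 30-month full cycle, the differentials grow larger before the next full resets the chain.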

As always, there are tradeoffs in using any approach.
