Optimizations for keeping your backups in HubStor

August 27, 2019 | Geoff Bourgeois | Cloud Storage, segregated backup, shared storage

The cloud offers distinct advantages for enterprise backup and disaster recovery strategies, such as convenient offsite storage that is both economical and reliable.

Part of our customer-centric approach is paying attention to the different ways customers use HubStor and understanding their needs so we can enhance our cloud data management platform.

This process leads to interesting new capabilities that make it easier for customers to manage their data with HubStor. Today, customers use HubStor for backup in the following ways:

  1. As a backup solution for shared storage.
  2. To protect data residing within VMs.
  3. As a backup for cloud-based data residing in Azure Blob Storage accounts and for Office 365 data protection.
  4. As reliable storage for backups created by other backup software, often to serve as either offsite protection or for long-term retention in a cloud archive instead of on-premises disk or tape.

With more customers using HubStor for management of their backup sets, we’ve released several improvements to the platform for this use case. These new developments are currently available and include:

  • Support for any file size – HubStor previously had a post-compression file size limit of 4.7 TB, but now any file size is possible.
  • Ability to seamlessly resume large file transfers – Nobody wants to start from scratch when a large file transfer is disrupted. So, if your network connection drops during capture of a 10 TB file, HubStor resumes the transfer from where it left off within the file (see the sketch after this list).
  • Better write performance – HubStor now supports an optimized ingestion process for very large files that is just as secure as the standard ingestion methodology but places no load on your App Service. Additionally, the HubStor Connector Service (HCS) now supports single-threaded operations on large files that max out the throughput of the available machine and network resources.
  • Better read performance – Previously, if you were running the HubStor Export Utility on a machine in the cloud and retrieving large files, you would see a download rate of 200 Mbps. Now, HubStor delivers 12x the speed: the same scenario yields sustained download rates of 2.4 Gbps compressed (3+ Gbps logical).
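To make the resume behavior concrete, here is a minimal sketch of a resumable, chunked upload using the block-blob API in Azure Blob Storage (via the azure-storage-blob Python SDK). It illustrates the general technique of skipping blocks that were already staged before a connection drop; it is not HubStor's internal implementation, and the connection string, container name, and 64 MiB block size are assumptions for illustration.

```python
# Minimal sketch of a resumable, chunked upload to Azure Blob Storage.
# Illustrates the resume-from-where-you-left-off technique, not HubStor's
# internal implementation. Block size and naming are assumptions.
import base64
from azure.core.exceptions import ResourceNotFoundError
from azure.storage.blob import BlobBlock, BlobClient

CHUNK_SIZE = 64 * 1024 * 1024  # 64 MiB per block (illustrative)

def resumable_upload(conn_str: str, container: str, blob_name: str, path: str) -> None:
    blob = BlobClient.from_connection_string(conn_str, container, blob_name)

    # Blocks staged by an earlier, interrupted attempt can be skipped on retry.
    try:
        _, uncommitted = blob.get_block_list(block_list_type="all")
        already_staged = {b.id for b in uncommitted}
    except ResourceNotFoundError:
        already_staged = set()  # nothing staged yet; this is a fresh upload

    block_list = []
    with open(path, "rb") as f:
        index = 0
        while chunk := f.read(CHUNK_SIZE):
            # Block IDs must be base64-encoded and the same length for every block.
            block_id = base64.b64encode(f"block-{index:010d}".encode()).decode()
            if block_id not in already_staged:
                blob.stage_block(block_id=block_id, data=chunk)
            block_list.append(BlobBlock(block_id=block_id))
            index += 1

    # Committing the ordered block list publishes the blob atomically.
    blob.commit_block_list(block_list)
```

Uncommitted blocks persist on the service for a limited time (Azure garbage-collects them after roughly a week), which is what makes resuming possible without re-uploading the chunks that already arrived.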

Microsoft recently lowered Archive tier pricing for all regions. In some cases, storage economics in Azure are as low as $0.00099/GB/month. For one petabyte of data, that works out to a storage cost of around $12,000 annually.
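As a quick sanity check on that figure, the arithmetic works out as follows, assuming the decimal convention of 1 PB = 1,000,000 GB that cloud pricing pages typically use:

```python
# Back-of-the-envelope check of the Archive-tier cost quoted above.
price_per_gb_month = 0.00099   # USD per GB per month (Archive-tier figure above)
gb_per_petabyte = 1_000_000    # decimal convention used by cloud pricing
annual_cost = price_per_gb_month * gb_per_petabyte * 12
print(f"${annual_cost:,.0f} per PB per year")  # -> $11,880 per PB per year
```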

With cloud storage offering such a distinct economic advantage, now is the time to explore using HubStor as part of your backup and disaster recovery strategy. Reach out to us to speak with one of our cloud storage experts today.
