
Who Do You Think Won the One Million File Backup Challenge?

by Bridget.Giacinto, on Jul 25, 2014 3:18:04 PM

Have you ever wondered why two sets of files with the exact same cumulative size can take drastically different amounts of time to back up? It all comes down to the size of the individual files being backed up. With the total cumulative size held equal, a backup set that contains a large number of small files will take longer to back up than a backup set that contains a smaller number of large files. Why? The primary reason is fragmentation, but in addition, the more files you have, the more resource overhead is required to open each file, read its metadata, and allocate and record its data. A large portion of disk activity may be spent simply searching for the location of each file rather than performing the actual backup. With all else equal, it really comes down to the efficiency of your backup software, which is what prompted us to do this benchmark study.
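The per-file overhead described above is easy to reproduce yourself. Here is a hypothetical micro-benchmark (a rough sketch, not the NovaStor test itself) that copies the same total number of bytes twice: once split across thousands of small files, and once as a single large file.

```python
import os
import shutil
import tempfile
import time

# Hypothetical micro-benchmark: copy the same total number of bytes as
# many small files vs. one large file, and compare wall-clock times.

def make_files(root, count, size):
    os.makedirs(root)
    for i in range(count):
        with open(os.path.join(root, f"f{i:05d}.dat"), "wb") as f:
            f.write(b"x" * size)

def timed_copy(src, dst):
    start = time.perf_counter()
    shutil.copytree(src, dst)           # one open/read/write/close per file
    return time.perf_counter() - start

base = tempfile.mkdtemp()
small_src = os.path.join(base, "small")
large_src = os.path.join(base, "large")
make_files(small_src, 2000, 512)        # 2,000 files x 512 B = 1 MiB total
make_files(large_src, 1, 2000 * 512)    # 1 file x 1 MiB = same total

t_small = timed_copy(small_src, os.path.join(base, "small_copy"))
t_large = timed_copy(large_src, os.path.join(base, "large_copy"))
print(f"2,000 small files: {t_small:.4f}s   1 large file: {t_large:.4f}s")
shutil.rmtree(base)
```

Even at this tiny scale, the small-file copy loses, because every file adds its own open, metadata read, and close; at a million files that overhead dominates the job.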

Our quest was to find out how NovaBACKUP stacked up against Acronis and Symantec in a head-to-head test of speed, CPU usage, and peak memory usage when tasked with backing up a million small files to a local disk drive. NovaStor engineers conducted an internal benchmark study of this specific scenario to rate the efficiency of our backup software against our competitors'. The results may surprise you.

1,000,000 Small File Backup Challenge

Results of the 1,000,000 Small File Backup Challenge:

Why You Want Backup Jobs to Run Quickly

The IT system administrator, or whoever is responsible for backing up your data, will most likely be tasked with ensuring that backups complete in the shortest possible time with minimal impact on overall system performance. This is often done by first establishing a backup window: a predetermined time slot allocated to backing up specific data, applications, databases, or systems without disrupting operations. The faster your backups are, the greater the chance that as your data grows…and it will grow…you will be able to stay within that backup window.

Establishing a backup window is done by monitoring a system to identify its peak and off-peak CPU usage. If backups cause high CPU usage, you have no choice but to schedule them during off-peak hours, or they will, without a doubt, disrupt performance. The lower the CPU utilization, the more flexibility you have in when you can run your backup jobs.
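The arithmetic behind a backup window is simple: hours needed = data size ÷ throughput, and the job fits only while that stays under the window length. A hypothetical sketch (the numbers here are illustrative, not from the study):

```python
# Hypothetical back-of-the-envelope check: does a backup job still fit
# inside its window as the data grows? All figures are illustrative.

def fits_backup_window(data_gb, throughput_gb_per_hr, window_hours):
    """Return True if the job should finish inside the window."""
    hours_needed = data_gb / throughput_gb_per_hr
    return hours_needed <= window_hours

# 500 GB at 100 GB/hour needs 5 hours -- it fits a 6-hour overnight window.
print(fits_backup_window(500, 100, 6))   # True
# After growth to 800 GB, the same job needs 8 hours -- it no longer fits.
print(fits_backup_window(800, 100, 6))   # False
```

This is why backup speed matters: faster software raises the effective throughput, which buys you headroom against data growth before the window breaks.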

In our one million small file backup benchmark study, we looked at how much time it took each backup software program to run the exact same backup job. NovaBACKUP Business Essentials took the least amount of time, completing the job 130% faster than Symantec and almost 40% faster than Acronis.

[Chart: Faster Backup Speeds]

Why is High CPU Usage an Issue?

If you are experiencing high CPU utilization when running backup jobs, you are likely seeing sluggish system performance or unexpected errors. When a server's CPU is working at or above 80-90% utilization, applications on that server are likely to slow down, stop responding, or throw errors, and backups may fail. If CPU utilization maxes out at 100% when you run your backup jobs, the server will be forced to take processing power away from other processes, potentially causing server performance issues and failed backup jobs. To verify which process is causing the high CPU usage, you can use Task Manager to see the processes currently running on your Windows system: press Ctrl + Shift + Esc to open Task Manager, click the Processes tab, then click the "CPU" column header to sort processes by CPU usage.
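The utilization percentage Task Manager shows is just CPU time consumed divided by wall-clock time elapsed. A minimal sketch of that ratio for a single process (the workload here is invented purely to generate some CPU-bound and some idle time):

```python
import time

# Hypothetical sketch of what a CPU utilization figure means:
# utilization = CPU time consumed / wall-clock time elapsed.
wall_start = time.perf_counter()
cpu_start = time.process_time()

total = sum(i * i for i in range(2_000_000))  # CPU-bound work
time.sleep(0.25)                              # idle time (consumes no CPU)

wall_elapsed = time.perf_counter() - wall_start
cpu_elapsed = time.process_time() - cpu_start
utilization = 100 * cpu_elapsed / wall_elapsed
print(f"CPU utilization over the interval: {utilization:.0f}%")
```

A backup engine that keeps this ratio low leaves headroom for the applications the server actually exists to run.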

In our one million small file backup benchmark study, NovaBACKUP used 6.3x less CPU than Symantec and 1.2x less than Acronis.

[Chart: High CPU Usage]

Why You Don’t Want High Memory Usage

Excessive memory usage, even on an average server with 16 GB of memory or more, can result in memory shortages. Memory shortages develop when multiple processes demand more memory than is currently available, or when you are running poorly written applications that leak memory. If you run a lot of programs simultaneously, as most of us do, you need backup software that is light on memory usage so that it can run in the background without hogging all of your system's memory.
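One common way backup software keeps its footprint small is to stream data in fixed-size buffers rather than pulling whole files into memory. This is a hypothetical sketch of that difference (not NovaBACKUP's actual implementation), comparing the peak memory of the two approaches:

```python
import os
import tempfile
import tracemalloc

# Hypothetical sketch: peak memory of reading a file all at once vs.
# streaming it in fixed-size chunks, as a backup engine might.
tmp = tempfile.NamedTemporaryFile(delete=False)
tmp.write(os.urandom(8 * 1024 * 1024))   # 8 MiB of test data
tmp.close()

tracemalloc.start()
with open(tmp.name, "rb") as f:
    data = f.read()                      # entire file held in memory
_, peak_whole = tracemalloc.get_traced_memory()
del data
tracemalloc.stop()

tracemalloc.start()
with open(tmp.name, "rb") as f:
    while chunk := f.read(64 * 1024):    # 64 KiB buffer at a time
        pass                             # a real engine would write it out here
_, peak_chunked = tracemalloc.get_traced_memory()
tracemalloc.stop()
os.unlink(tmp.name)

print(f"whole-file peak: {peak_whole / 2**20:.1f} MiB, "
      f"chunked peak: {peak_chunked / 2**20:.2f} MiB")
```

The chunked pass peaks at roughly the buffer size no matter how large the file is, which is the behavior you want from a process running quietly in the background.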

In our one million small file backup benchmark study, NovaBACKUP used 26x less peak memory than Acronis and 20x less than Symantec.

[Chart: High Memory Usage for Backups]

 

Here is the full results breakdown:

NovaBACKUP vs Acronis True Image 2014

  • NovaBACKUP backs up 39% faster than Acronis
  • NovaBACKUP uses 1.2x less CPU than Acronis
  • NovaBACKUP uses 26x less memory during backup than Acronis


NovaBACKUP vs Symantec Backup Exec 2014

  • NovaBACKUP backs up 130% faster than Symantec
  • NovaBACKUP uses 6.3x less CPU than Symantec
  • NovaBACKUP uses 20x less memory during backup than Symantec


The Clear Winner on All Counts:
NovaBACKUP Business Essentials

Learn more about NovaBACKUP Business Essentials


Benchmarking Environment:

Windows 7
Intel® Core™ i7-4770 CPU @ 3.4 GHz
Installed memory (RAM): 16 GB
Backup Location: Local Disk Drive
