Windows Server Backup Software
By Dr. Dallas E. Hinton
As any experienced computer user knows, the question is not “Will I lose data?” but “When will I lose data?”
The question I prefer to ask is “How can I recover from losing data?”
Keeping good backups is critical to coping with data loss. It is estimated that 43% of all businesses suffering major data loss fail within 6 months of that loss. And if you lose those family photographs you may be in big trouble!
There are a number of elements we need to consider when discussing backups:
The data must be secure. You probably wouldn’t want your medical or financial records stored somewhere in cyberspace where only your password stands between you and the criminal element.
The backup program must be easy to use. You may not do a backup if the backup program adds significantly to your workload.
The backup program must be quick to do a backup. You aren’t likely to spend hours backing up data you may never need to recover, and human nature being what it is, we all know a drive crash will never happen to us.
The backup program must be quick to restore. Once data is lost we want to recover and get back to work as soon as possible.
The backup program must be affordable. For many this is the first criterion considered, although in my opinion it should be the last.
It is important to distinguish between backing up data and backing up the entire computer. A personal computer or a laptop with a small drive can be completely backed up—data, programs and all—and will usually restore to something very close to a working machine again, although this is really a job for drive imaging software. With a more complex computer it’s usually faster to reinstall the operating system and programs and then restore only the data. On a server-based network, users should be discouraged from having any local data. This approach makes it simple to just “drop in” a new computer at any desk whether as an upgrade, a replacement, or just a loan during repairs.
The following discussion applies either to a stand-alone computer or to a server-based network, excluding the process of backing up individual workstations.
The most convenient backup system would have duplicate copies of every file and program immediately accessible whenever data is lost. On a domain-based network it is possible to approach this level of protection by using Volume Shadow Copies and version tracking. Windows 7 offers some of this protection (through its Previous Versions feature), but it’s not particularly convenient or easy to use, and earlier versions of Windows are even more challenging.
The simplest of solutions is to use an online storage site such as Mozy or iDrive. There are many such sites, some free and some not, but they all suffer from the same problems: lack of speed and (potentially) lack of security. All online sites are inherently slow to access because the user’s upload speed is usually quite slow compared to the download speed. In addition, it’s seldom possible to be 100% sure that the online site data is kept completely secure.
A second solution is to use another hard drive, either permanent or portable. This method is much faster and security is not such an issue, but there is always the risk of mechanical failure or physical loss (fire, theft, etc.). Optical media (CD, DVD) are somewhat less vulnerable to damage but there are problems writing to optical media and they don’t hold much data.
A third solution is to use a networked drive, but this presumes you have a network and still doesn’t address the issue of mechanical failure or physical loss.
Finally, there is the solution taken (I fear) by the majority of users, which is to do nothing and hope for the best. This is no solution at all, but a guarantee of unhappiness somewhere down the road!
The method I use for my own network, and for the networks I have maintained, is a combination of these solutions. In brief, I make more than one backup and each backup is successively more difficult to access. My first line of defence is to back up to a local, dedicated hard drive. This backup is done nightly using automated backup software such as GRBackPro or even <shudder> Windows backup (but there are a lot of problems with using this free software). This first backup ensures I have the day’s work protected against drive failure or user error; if the user happens to erase the wrong file the backup is no more than 24 hours old. The chance of both hard drives failing at the same time is small, although not zero, but there is no protection against physical damage or theft.
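If you would like to see what that first level looks like in practice, here is a minimal sketch in Python of a nightly job that zips the day's working data onto a dedicated local backup drive. The paths, the dated archive name, and the use of a plain ZIP archive are my own assumptions for illustration; a packaged product such as GRBackPro handles all of this (plus scheduling, incremental backups, and error handling) for you.

    import datetime
    import zipfile
    from pathlib import Path

    SOURCE = Path(r"D:\Data")            # assumed location of the working data
    DEST = Path(r"E:\Backups\Level1")    # assumed dedicated local backup drive

    def nightly_backup():
        """Zip everything under SOURCE into a dated archive on the backup drive."""
        DEST.mkdir(parents=True, exist_ok=True)
        archive = DEST / f"data-{datetime.date.today().isoformat()}.zip"
        with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
            for path in SOURCE.rglob("*"):
                if path.is_file():
                    zf.write(path, path.relative_to(SOURCE))
        return archive

    if __name__ == "__main__":
        print("Wrote", nightly_backup())

Scheduled to run overnight with Windows Task Scheduler, a script like this gives the same "no more than 24 hours old" guarantee described above.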
Later in the night (I do this every night but you might do it less often depending on your level of paranoia) I back up the dedicated drive to another location on the network, usually to a computer which does nothing else but store backups. This now gives a second level of protection, and I normally use a password on this backup as the computer may be somewhere else in the building (by preference in the Server room, since it will usually provide a UPS, air conditioning, and security).
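The second level is essentially a copy of the first-level archives onto a different machine. A sketch of that step, again with assumed paths (the share name \\BACKUPSRV\Backups is made up for the example), might look like the following. Note that the password protection mentioned above is applied by the backup software itself; this sketch only illustrates moving the archives to the backup computer.

    import shutil
    from pathlib import Path

    LEVEL1 = Path(r"E:\Backups\Level1")            # first-level archives (assumed path)
    LEVEL2 = Path(r"\\BACKUPSRV\Backups\Level2")   # share on the dedicated backup computer (assumed)

    def replicate_to_backup_server():
        """Copy any first-level archive that has not yet been replicated."""
        LEVEL2.mkdir(parents=True, exist_ok=True)
        for archive in LEVEL1.glob("*.zip"):
            target = LEVEL2 / archive.name
            if not target.exists():
                shutil.copy2(archive, target)   # copy2 preserves file timestamps

    if __name__ == "__main__":
        replicate_to_backup_server()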
Finally, I back up all the second-level backups onto a removable hard drive (I prefer to use a third computer for this task but the drive could be in the backup computer) and then I take the removable drive off-site. You might be able to do a data exchange/storage service with a friend, or a local business with which you have a good relationship; in a pinch even your local bank vault will work. This offsite backup ensures that your data is stored in a separate location, so that you don’t lose everything in case of disaster. Since it’s a backup of a passworded backup, it’s reasonably secure even if it does happen to wander into someone else’s hands.
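Before the removable drive leaves the building it is worth a quick check that the archives on it are actually readable; a drive full of unreadable backups is worse than no drive at all. Assuming ZIP-format archives on an assumed drive letter F:, a simple verification pass could look like this (for password-protected archives you would first supply the password with ZipFile.setpassword()):

    import zipfile
    from pathlib import Path

    REMOVABLE = Path(r"F:\OffsiteBackups")   # assumed drive letter of the removable disk

    def verify_archives():
        """Report the first corrupt member of each archive, if any."""
        for archive in sorted(REMOVABLE.glob("*.zip")):
            with zipfile.ZipFile(archive) as zf:
                bad = zf.testzip()           # returns None if every member reads back cleanly
                print(f"{archive.name}: {'OK' if bad is None else 'CORRUPT: ' + bad}")

    if __name__ == "__main__":
        verify_archives()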
Physically exchanging the removable hard drive is the only part of this operation that can’t be automated, and it’s not terribly serious if it isn’t done on a rigid schedule.
The end result, then, is that you have an original data source, a very easily accessed local backup, a less easily accessed and slightly older nearby backup, and a very secure, somewhat hard to access and even older backup. The chance of all these backups being lost at the same time is very small indeed.
There are an almost infinite number of programs which purport to be the answer to your backup needs. I’ve tried many of them, and have settled on GRBackPro as the software which does what I want. Part of the choice is cost, of course, but it’s also important to consider whether you can get assistance with the program if it becomes necessary. One of the reasons I recommend GRSoftware is that they have been very responsive over the years. It’s worth noting here that some software uses a proprietary method of data storage. With Windows Backup, for example, not only is the method proprietary but data backed up under one version of the software often cannot be restored by another version, even a newer one! GRBackPro uses the standard, familiar ZIP™ file format, which if necessary can be restored directly from Windows. If you have a favourite program, by all means use it; anything is better than nothing. If you’re hunting for something to use, I suggest you take a look at GRBackPro.
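To see what a standard format buys you, restoring from such an archive needs nothing more exotic than Windows Explorer, any ZIP utility, or a few lines of Python. The archive name and paths below are made up for the example:

    import zipfile
    from pathlib import Path

    ARCHIVE = Path(r"E:\Backups\Level1\data-2010-06-15.zip")   # hypothetical archive name
    RESTORE_TO = Path(r"D:\Restored")                          # hypothetical restore location

    # Extract the entire archive; no proprietary restore tool is required.
    with zipfile.ZipFile(ARCHIVE) as zf:
        zf.extractall(RESTORE_TO)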
It takes a little while to get everything set up for this multi-level backup solution. It takes some planning, and you may have to buy some hardware. Fortunately, very large hard drives are now quite inexpensive; the cost is small compared to the cost of data loss. With a little thought and a little money you can set up a reliable backup solution that will let you recover quickly and easily from even the worst disaster.
Dr. Hinton is a retired Network Technician and noted Educator. Trained by Nortel, he instructed local teachers in the field of Network Management and also taught Computer Science and Career Preparation at secondary school. In addition he spent many years as owner of Associated Computer Technologies, managing and maintaining networks in business offices and for organizations such as the Down Syndrome Research Foundation in Burnaby, BC.