Shell scripts to back up users' data to a network volume.

BGoldman
New Contributor

Hi,

We are about to embark on a huge OS upgrade at my corporation. We are looking for a low-cost solution to back up all of the users' home folders to a network share, possibly using shell script commands. We would like to use Self Service to initiate the process. Could anyone provide me with any information and possible script examples that we could incorporate into our migration plan? I would appreciate any feedback.

Thanks,
Brad Goldman

5 REPLIES

davidacland
Honored Contributor II

I would suggest rsync for this purpose. We've been using it for years to do this type of thing. If you are sending all the data to a single location just for backup purposes the command will look something like:

rsync -uhzrlv "/Users/$USER" /Volumes/networkdrive/

A few other things I would recommend adding:

You can reference an exclusion file to specify certain files or folders to skip. The file contents would look like:

# files and folders to exclude
- .Trash
- .DS_Store
- Caches
- *com.apple.LaunchServices

You can then include this file in your command with:

rsync -uhzrlv --filter='merge /path/to/exclusion/file' "/Users/$USER" /Volumes/networkdrive/

You can also add some logging by including --progress in the options and appending "2>> /Users/Shared/${USER}syncerror" at the end. This will capture errors from the rsync process and save them to a text file named with the user's name. The full command would look like:

rsync -uhzrlv --progress --filter='merge /path/to/exclusion/file' "/Users/$USER" /Volumes/networkdrive/ 2>> /Users/Shared/${USER}syncerror

You could of course put the log file wherever you like.
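
If you're kicking this off from Self Service, it's also worth checking that the share is actually mounted before the copy runs. A minimal sketch of a wrapper script, assuming the same placeholder share path and exclusion file as above:

#!/bin/bash
# Hypothetical wrapper - abort if the backup share isn't mounted, otherwise sync
MOUNT_POINT="/Volumes/networkdrive"
if [ ! -d "$MOUNT_POINT" ]; then
    echo "Backup share not mounted at $MOUNT_POINT - aborting" >&2
    exit 1
fi
rsync -uhzrlv --progress --filter='merge /path/to/exclusion/file' "/Users/$USER" "$MOUNT_POINT/" 2>> "/Users/Shared/${USER}syncerror"
exit 0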

I've used $USER in this example but you might want to use a better method to set this variable at the start: https://macmule.com/2014/11/19/how-to-get-the-currently-logged-in-user-in-a-more-apple-approved-way/
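
For example, one common way to grab the console user (a rough sketch, not necessarily the exact method from that article) is:

# Ask the system who is logged in at the console, rather than trusting $USER,
# which can resolve to root when the script is run by a management policy
loggedInUser=$( scutil <<< "show State:/Users/ConsoleUser" | awk '/Name :/ && ! /loopback/ { print $3 }' )
rsync -uhzrlv "/Users/$loggedInUser" /Volumes/networkdrive/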

Hope this helps.

BGoldman
New Contributor

Davidacland,

Thanks for the information. I will look into it; hopefully this will be the answer. Of course I will test several times over before I implement. :)

dpertschi
Valued Contributor

@Goldman
If your meaning of 'huge' is dozens or hundreds of machines, do you have a ridiculous amount of free space on the network volumes?

I ask because I'm planning something similar this year, up to 500 machines, and as I audit I'm finding quite a lot of users with local home folders well over 100GB. (the server and network guys are going to hate me forever).

Is storage and bandwidth not an issue for you, or are your users actually trained well enough to keep most of their stuff on the server anyway?

BGoldman
New Contributor

dpertschi,

We have over 800 Macs in our environment. Bandwidth should not be a problem as we mostly do migrations after hours. I believe we have a server with plenty of space that might be able to handle the load. Most of our clients keep all of their data on the desktops of their machines, even though they have been told the local drives don't get backed up.

daz_wallace
Contributor III

I would suggest firstly (if you haven't already) building an EA to gather the user folder size for each Mac:

#!/bin/bash
# Report the total size of /Users (human-readable, e.g. "120G") as an extension attribute
usersize=$( du -h -d 0 /Users | awk '{print $1}' )
echo "<result>$usersize</result>"
exit 0

You can then output the information into a CSV to get a realistic picture of the folder sizes and how much data you'll be dealing with.
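
As a rough illustration (the file name and column position are assumptions; adjust them to match however you export the report), you could then total the exported sizes with something like:

# Hypothetical: sum human-readable sizes (e.g. "120G", "500M") found in the
# second comma-separated column of an exported report named home_sizes.csv
awk -F, '{
    n = $2 + 0
    if ($2 ~ /T/)      n *= 1024
    else if ($2 ~ /M/) n /= 1024
    else if ($2 ~ /K/) n /= 1048576
    total += n
} END { printf "Approximate total: %.1f GB\n", total }' home_sizes.csv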

Also, rsync is great as it'll only copy the changed files.

I spent a few days last month implementing this with a client of ~150 Macs. The policy runs once a day in the early hours of the morning (staggered by an hour per ~50 odd Macs). This has been left to run daily and is keeping up well!

Darren