
Registered · 215 Posts · Discussion Starter · #1
The web server is old and slow. It performs well enough on a day-to-day basis, but it sometimes chokes and times out during a large backup.

The server hosts several virtual web sites. Each has several subdirectories, some of which need to be backed up nightly; others are large and cumbersome and do not require backups. Each site has
/web -- must be backed up
/users -- must be backed up
/logs -- large, no backup needed.
plus some other subdirectories that don't need to be backed up.

To make backups run more smoothly, I split the old one-process, one-tar nightly backup into a separate backup for each virtual site. Now the nightly run creates
site1.tar
site2.tar
site3.tar
and so on....

That worked for a while, but as the sites grow, the files get bigger and the tars are more likely to fail. The log files are a large part of the growth problem.

I'm going to change the backup procedure (again) to eliminate the backup of the "logs" directories.

Is there a way to express the tar command so that I can create one backup for each site, but include only 2 of the subdirectories (or exclude 1 of the subdirectories) for each site?

The current command for each site's backup is
tar -xvf site1.tar /home/site1

I'm thinking of something like tar -xvf site1.tar /home/site1/web,/home/site1/users.

I could do two backups for each site:
tar -xvf site1.tar /home/site1/web
tar -xvf site1.tar /home/site1/users

That would just double the fun. :down: :down:

There is no money for a new web server. The frequency of backups and the retention policies are not negotiable.

Like the people working in public schools, this public school system's web server is overworked and underfunded. I use scripts and multiple archive directories to help manage this load, but I still have to inspect the newest TARs every day to reduce the chances of a foul-up.
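
Part of that daily inspection could probably be scripted as well. As a rough sketch (the /backups path and site name are just illustrative), listing an archive with tar -t returns a non-zero exit status if the archive is unreadable:

tar -tf /backups/site1.tar > /dev/null || echo "site1.tar failed the nightly check"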

15 backups (15 sites) nightly is complicated enough; 30 backups every night might push me over the edge. :eek: :eek:

I'm open to suggestions. Thanks in advance.
 

Retired Trusted Advisor · 7,154 Posts · #2
Looks like tar DOES (or can) have an exclude option. I would read the man page on tar on your system and see if it supports the "--exclude" option.

EDIT: Your 'tar -xvf site1.tar /home/site1/web /home/site1/users' command should also work, I believe, but note that the comma between the paths has to go; just separate the paths you want to tar with spaces.
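
For example, assuming GNU tar (paths illustrative), the create command with the exclude option, or with just the two directories you want, might look something like:

tar -cvf site1.tar --exclude='/home/site1/logs' /home/site1
tar -cvf site1.tar /home/site1/web /home/site1/users

The exact exclude syntax can vary between tar versions, so the man page on your box is the final word.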

Peace...
 

Registered · 215 Posts · Discussion Starter · #3
Before posting, I tried several variants each of
tar -xvf site1.tar dir1 dir2
and
tar -xvf site1-exclude.tar /home/site1/ -exclude="/logs"
to no avail. I assumed that this ancient version of tar didn't work as advertised in its man page.

When reading your answer, I noticed that
tar -xvf
was a command to extract from a tar, not to create a new one.

When I changed to
tar -cvf site1.tar dir1 dir2
and
tar -cvf site1-exclude.tar /home/site1/ -exclude="/logs"
it worked. Had I written the commands right the first time, I wouldn't have needed to post. :eek: :eek: :eek:
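
The nightly script can now just loop over the sites instead of carrying 15 hand-written tar lines. A rough sketch (the site list and /backups path are illustrative):

for site in site1 site2 site3; do
    tar -cvf "/backups/$site.tar" --exclude="/home/$site/logs" "/home/$site"
done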

It was easier to see the mistake in your post than in my post and in my commands. Without your help, I would have been tearing my hair out for another day or two. THX
 