Tuesday, July 8, 2008

Local Files > Remote tar.gz with GnuTar and SSH

How to go from local files to a remote compressed tarball, with no intermediate temp files, in one command with tar and ssh.

So I'm getting ready to do a fresh upgrade/install to Ubuntu 8.04, the Hardy Heron, and I decide to back up my home folder to my file server. A quick ssh server df -h tells me that I have 443GB left on a 2.3TB array. That's barely enough; I need more HDs.

Ok, so I need to tar and feather my home before sending it. df -h on my computer shows that I'm using 85% of my /home partition, so there's probably not enough room to tar it locally and then transfer it to my server. I could just transfer all the files and folders to the server without tarring, but hey, that's no fun :)
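For contrast, this is the two-step version I'm trying to avoid, since it needs enough free local space to hold a temporary archive before the copy (the hostname and paths here are just placeholders):

tar czvf /tmp/home-backup.tar.gz /home
scp /tmp/home-backup.tar.gz remotehost:
rm /tmp/home-backup.tar.gz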

The problem is that although rsync and scp both support compression during the transfer, everything comes out on the other end just like it went in: plain, uncompressed files. I did some googling to see if I could just pipe tar to rsync or scp, but didn't find anything helpful. However, I was reminded that you can pipe output from a local program to a remote program using ssh, a la echo "Hello world" | ssh anotherhost.com 'cat > /tmp/1' (taken from here).
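The general shape of the trick is just this (the host and both commands are placeholders):

local_command | ssh remotehost 'remote_command'

Whatever local_command writes to stdout gets fed over the ssh connection into remote_command's stdin on the other machine.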

So all we need to do now is get tar to pipe to ssh, which then cats the stream to a file on the remote server. Here's the basic process:
  1. Tar and gzip the files, writing the archive to stdout:
    tar czvf - /home

  2. Use ssh to connect to remote server:
    ssh remotehost

  3. On the remote end, write the stream to a timestamped file:
    cat > home-`date +%Y.%m.%d-%H:%M:%S`.tar.gz

If we put all that together, we get the following:

tar czvf - /home | ssh remotehost 'cat > home-`date +%Y.%m.%d-%H:%M:%S`.tar.gz'
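If you want to sanity-check the backup afterwards, something along these lines should do it; the filename is whatever timestamp your run produced, I'm just using a made-up one here:

ssh remotehost 'ls -lh home-*.tar.gz'
ssh remotehost 'tar tzvf home-2008.07.08-10:30:00.tar.gz | head'

The first command confirms the archive landed and shows its size, and the second lists the first few entries without copying anything back.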

I hope this helps anybody unfortunate enough to stumble upon this blog.