Backup onto a remote computer

Ok, so you have backups on your local hard drive, for example via "Snapshots using rsync and hardlinks" or "Backup using tar" below. But wouldn't it be nice to also store some of these on a different computer? After googling a bit, this is how I copy files to the remote computer:

You need to:

  • be able to set up a new user
  • be able to edit sshd_config

On the remote computer

Most of the setup is done on the remote computer. First we add a new user; let's call him "backupuser". We then create an SSH key for this user so that we can log in without a password.
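Adding the user works, depending on the distribution, with something like:

 adduser backupuser

Then switch to that user and generate the key: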

 su backupuser

 ssh-keygen
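The public half of the key also has to be authorized for this account on the server; assuming the default file name from ssh-keygen (id_rsa.pub here), that is:

 cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
 chmod 700 ~/.ssh
 chmod 600 ~/.ssh/authorized_keys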

The idea is that we copy the private key to our own computer; we can then log in automatically and run an rsync backup from a cron script. A user that can log in without a password is not ideal, so we want to restrict what this user can do. For this we create the following script and call it "validate-rsync" (save it as /home/backupuser/bin/validate-rsync and make it executable):

#!/bin/sh

# only allow rsync in server mode; anything that contains shell
# metacharacters or doesn't look like an rsync server call is rejected
case "$SSH_ORIGINAL_COMMAND" in
    *\&*|*\(*|*\{*|*\;*|*\<*|*\`*|*backupuser@server:/backup)
        echo "Rejected"
        ;;
    rsync\ --server*)
        echo "$SSH_ORIGINAL_COMMAND" >> ~/rsync.log
        $SSH_ORIGINAL_COMMAND
        ;;
    *)
        echo "Rejected"
        ;;
esac

This should allow only "rsync --server" to be executed, which makes things a lot safer. Now we have to force this script to be executed whenever this user logs in. We do this by adding the following to the sshd config file (usually /etc/ssh/sshd_config):

Match User backupuser
  ForceCommand /home/backupuser/bin/validate-rsync
  X11Forwarding no
  AllowTcpForwarding no
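After changing the config, sshd has to re-read it, for example (on systemd-based systems; the service may also be called ssh):

 systemctl reload sshd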

On the client

Copy the private key from the server to the client (the script below refers to it as /path/to/keyfile) and then create a script similar to this one:

#!/bin/bash

# paths and progs I use
DEST=backupuser@server:/backup/
ORIG=/backup/local/weekly.0
RSYNC=/usr/bin/rsync

# don't use anything else
unset PATH

# upload the newest local snapshot to the server
$RSYNC -avz --delete-after -e "ssh -i /path/to/keyfile" $ORIG $DEST

This will upload the backup to the server.
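To automate the upload, a crontab entry along these lines works (the script name here is made up):

30 3 * * 0 /path/to/bin/upload_backup >/dev/null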

To rotate the backups on the server, run the following via cron:

#!/bin/bash

# paths and progs I use
DEST=/backup
RM=/bin/rm
MV=/bin/mv
CP=/bin/cp

# don't use anything else
unset PATH

# delete oldest snapshot
if [ -d $DEST/weekly.3 ] ; then
    $RM -rf $DEST/weekly.3
fi

# rotate other snapshots
if [ -d $DEST/weekly.2 ] ; then
    $MV $DEST/weekly.2 $DEST/weekly.3
fi

if [ -d $DEST/weekly.1 ] ; then
    $MV $DEST/weekly.1 $DEST/weekly.2
fi

if [ -d $DEST/weekly.0 ] ; then
    $CP -al $DEST/weekly.0 $DEST/weekly.1
fi

Snapshots using rsync and hardlinks

At the moment I'm using rsync and hardlinks. This allows you to take a lot of snapshots without needing too much storage. On the other hand, since hardlinks are used, you really only have one copy of each file: if the disk fails, all versions of that file are gone. So on its own this is not really a backup solution.

What are hardlinks? If you make a hardlink copy of a file, you don't actually copy the content, you just create another pointer to the same area on your hard drive. So say you have a file "a" and make a hardlink copy of it called "b". If you now delete "a", you can still access the file "b"; the actual data won't be deleted or overwritten until you delete all the hardlinks to it. Also, if you change either "a" or "b" in place, the other file will change too, because on your hard disk there is only one file. Why is this useful? Because it makes creating copies of files that have not changed really cheap, both in disk space and in the time needed for copying.
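A quick way to see this in action on the command line:

 echo "some data" > a
 ln a b        # hardlink copy of "a"
 ls -li a b    # both names show the same inode number
 rm a
 cat b         # prints "some data", the file is still there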

Rsync is a program that synchronizes two directories, and it can be set up so that it only copies files that have actually changed. You can also tell it to compare the files to an already existing third directory and create hardlinks to the files that are unchanged there (the --link-dest option).
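In its simplest form that looks like this (directory names are made up); files in /data that are unchanged compared to /backup/old become hardlinks in /backup/new instead of new copies:

rsync -a --delete --link-dest=/backup/old /data/ /backup/new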

With the above it is fairly easy to set up a script that creates a backup every N minutes/hours/days/weeks/months, so that you not only have the last backup at hand, but also the M backups before that. Say you do backups every hour: you will have a copy of all files as they looked an hour ago, but also as they looked two hours, 3, 4, 5, 6, ..., M hours ago. And recovering a file just takes a copy command.

So here is the script to set up an hourly backup, keeping the last 3 backups before that too:

#!/bin/bash

# paths and progs I use
DEST=/path/to/where/the/backup/should/go
ORIG=/dir/that/should/be/backed/up
RM=/bin/rm
MV=/bin/mv
RSYNC=/usr/bin/rsync
TOUCH=/usr/bin/touch

# don't use anything else
unset PATH

# delete oldest snapshot
if [ -d $DEST/hourly.3 ] ; then
    $RM -rf $DEST/hourly.3
fi

# rotate other snapshots
if [ -d $DEST/hourly.2 ] ; then
    $MV $DEST/hourly.2 $DEST/hourly.3
fi

if [ -d $DEST/hourly.1 ] ; then
    $MV $DEST/hourly.1 $DEST/hourly.2
fi

if [ -d $DEST/hourly.0 ] ; then
    $MV $DEST/hourly.0 $DEST/hourly.1
fi

# create new snapshot, use hardlinks to hourly.1 if possible
if [ -d $DEST/hourly.1 ] ; then
    $RSYNC -a -v --numeric-ids --delete --link-dest=$DEST/hourly.1 $ORIG $DEST/hourly.0
else
    $RSYNC -a -v --numeric-ids --delete $ORIG $DEST/hourly.0
fi

# update time stamp
$TOUCH $DEST/hourly.0

The magic is in the rsync command: it backs up all files, but only copies those that are not already in the DEST directory, and for files that are unchanged compared to DEST/hourly.1 it just creates hardlinks. This means that on each run only the files that changed are actually copied.
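You can check this after a couple of runs ("somefile" standing in for any file that didn't change between snapshots):

 ls -i hourly.1/somefile hourly.0/somefile   # same inode number = stored only once
 du -shc hourly.*                            # du counts hardlinked data only once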

Once you have this, it is easy to also do daily backups for the last N days:

#!/bin/bash

# paths and progs I use
DEST=/path/to/backup/dir
RM=/bin/rm
MV=/bin/mv
CP=/bin/cp

# don't use anything else
unset PATH

# delete oldest snapshot
if [ -d $DEST/daily.3 ] ; then
    $RM -rf $DEST/daily.3
fi

# rotate other snapshots
if [ -d $DEST/daily.2 ] ; then
    $MV $DEST/daily.2 $DEST/daily.3
fi

if [ -d $DEST/daily.1 ] ; then
    $MV $DEST/daily.1 $DEST/daily.2
fi

if [ -d $DEST/daily.0 ] ; then
    $MV $DEST/daily.0 $DEST/daily.1
fi

# once a day: turn the oldest hourly snapshot into daily.0 (hardlink copy)
if [ -d $DEST/hourly.3 ] ; then
    $CP -al $DEST/hourly.3 $DEST/daily.0
fi

It just uses the hourly backups and once a day makes a hardlink copy of one of them.

Now all you need to do is set up a cronjob and get those scripts running at the correct time:

13 */4 * * * /path/to/bin/make_hourly_web_backup >/dev/null
15 13  * * * /path/to/bin/make_daily_web_backup >/dev/null
27 2   * * 0 /path/to/bin/make_weekly_web_backup >/dev/null

Buttermilk pancakes

Makes 14 to 16 three-inch pancakes.

amount ingredient
1 cup flour
1 tbsp. sugar
1 tsp. baking powder
1/2 tsp. baking soda
1/4 tsp. salt
1 egg
1 cup buttermilk (can add some milk depending on buttermilk)
2 tbsp. butter, melted
1/2 tsp. vanilla (optional)

Mix dry ingredients.

In another bowl mix buttermilk, butter and egg.

Mix both together, don't stir too long.

Cook in a bit of oil. Turn over when bubbles begin to break on the surface.

Mushroom bechamel

for 2:

amount ingredient
1/2 onion
12 small mushrooms
250 ml milk
1 tablespoon butter
1 tablespoon flour
  nutmeg, salt and pepper

Cut the onion and mushrooms, fry the onion, add the mushrooms, fry for a short time, add milk, let it boil.

In another pot, melt the butter, add flour, mix (don't use too much heat; it shouldn't turn brown). Add the milk-mushroom mixture and let it boil.

Add nutmeg, salt and pepper.

Pasta dough

Pasta for 4-5:

amount   ingredient
1 cup (180 g)   semolina
1 cup (146 g)   flour
1 teaspoon   salt
3 teaspoons   oil
145 ml   water

Also works without salt and oil.

Mix the salt with the semolina, add oil and water, mix well, then add flour until you get a nice dough. Let it rest for roughly 1 h. Roll it out, cut into pieces and dry the pasta for 30 minutes. Cook for a few minutes in salted water.

Makes ~480 g of pasta dough, roughly 48 ravioli + 90 g leftover (used to make e.g. fettuccine: long, thin pasta).

Chocolate cookies

amount   ingredient
1 cup (226 grams)   unsalted butter, room temperature
1 cup (200 grams)   granulated white sugar
1 cup (215 grams)   light brown sugar
2 large   eggs
2 teaspoons   pure vanilla extract
3 cups (420 grams)   all-purpose flour
1 teaspoon   baking soda
1/4 teaspoon   salt
1 1/2 cups (270 grams)   semisweet chocolate chips

Preheat oven to 350 ℉ (177 ℃) with rack in center of oven. Line two baking sheets with parchment paper. Set aside.

In the bowl of your electric mixer (or with a hand mixer), cream the butter. Add the white and brown sugars and beat until fluffy (about 2 minutes). Beat in eggs, one at a time, making sure to beat well after each addition. Add the vanilla and beat until incorporated.

In a separate bowl, combine flour, baking soda, and salt. Add the dry ingredients to the egg mixture and beat until incorporated, adding the chocolate chips about half way through mixing. If you find the dough very soft, cover and refrigerate until firm (about 30 minutes).

For large cookies, use a 2-tablespoon ice cream scoop or, with two spoons, drop about 2 tablespoons of dough (35 grams) onto the prepared baking sheets. Bake about 12-14 minutes, or until golden brown around the edges. Cool completely on a wire rack.

Makes about 4 dozen 3-inch round cookies.

Note: You can freeze this dough. Form the dough into balls and place on a parchment-lined baking sheet. Freeze, then place the balls of dough in a plastic bag, seal, and freeze. When baking, simply place the frozen balls of dough on a baking sheet and bake as directed; you may have to increase the baking time by a few minutes.

Bruschetta

amount ingredient
several tomatoes
2 cloves garlic
1 Tbsp olive oil
1 teaspoon balsamic vinegar
  salt and pepper
1 baguette

Cut the garlic and tomatoes into small pieces, mix, and add the oil, vinegar, salt and pepper.

Cut the baguette and put olive oil on one side. Heat the oven to 200 ℃ and put the bread (olive-oil side down) on a baking sheet; leave it in the oven until the bread is crisp (1-5 minutes).

Add 1-2 teaspoons of the mixture onto each slice of bread.

Filled mushrooms

for two:

amount ingredient
10 big mushrooms
1 small can tomato puree
2 cloves of garlic
1 can cream
200 grams grated cheese
  salt and pepper

Clean the mushrooms and remove the stems. Cut the stems and garlic very fine, add the tomato puree and mix with salt and pepper. Fill the mushrooms with the mixture and put them in a baking dish. Add cream and cheese and bake for roughly 20 min at 200 ℃.

Knitting a scarf

start 2006-12-25
end 2007-03-04

How to start - the long-tail cast-on

To start you need to get stitches onto one of the needles; I used 60 stitches for the scarf. Here is how you do it:

[photos: the long-tail cast-on, step by step]

Holding the needles

[photos: how to hold the needles]

How to do a purl

[photos: the purl stitch, step by step]

How to do a knit

[photos: the knit stitch, step by step]

A simple pattern

For the scarf I used a simple pattern: the very first stitch is a purl, and after that I alternated one row of purls with one row of knits.

Adding tassels

The problem is that you need to cut a lot of wool to the same length. Funnily enough, the width of a DVD case is just about the length you want… so here we go:

[photos: making the tassels, step by step]

Backup using tar

GNU tar has the nice feature of creating incremental backups. When you use incremental backups, tar creates the normal tar file and at the same time generates a special snapshot file containing a list of files, modification dates, etc. If you now make a second incremental backup, only files that changed relative to that list are saved. To recover a file you need the complete first backup and all the incremental backups since then, because you don't know in which archive the file will be.
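As a minimal sketch (file and directory names made up): the first run against a fresh snapshot file is a full backup, later runs against the same snapshot file only store what changed, and restoring means unpacking the full backup and then each incremental in order:

# full (level-0) backup; creates the snapshot file backup.snar
tar --create --gzip --listed-incremental=backup.snar --file=full.tgz /data

# later: only files changed since the last run end up in incr1.tgz
tar --create --gzip --listed-incremental=backup.snar --file=incr1.tgz /data

# restore: the full backup first, then the incrementals in order
tar --extract --listed-incremental=/dev/null --file=full.tgz
tar --extract --listed-incremental=/dev/null --file=incr1.tgz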

Here is a sample script I use:

#!/bin/bash
#
# backup some important files
# save them on the second hard-drive
#
# written by Arun Persaud
#    2001-10-10

MAILTO=""
export MAILTO

KEEP_DAYS=10    # days
KEEP_WEEKS=33   # days
KEEP_MONTHS=100 # days

# find and delete backup files that are older than the limits above
find /backup/log/     -type f -mtime +$KEEP_DAYS   -exec rm "{}" \;
find /backup/daily/   -type f -mtime +$KEEP_DAYS   -exec rm "{}" \;
find /backup/weekly/  -type f -mtime +$KEEP_WEEKS  -exec rm "{}" \;
find /backup/monthly/ -type f -mtime +$KEEP_MONTHS -exec rm "{}" \;


# get the date
DATE=`date +%Y-%m-%d` # complete date
WEEKDAY=`date +%w`    # day of the week, sunday=0
DAY=`date +%d`        # day of month, 01-31
WEEK=`date +%U`       # week 00-53
MONTH=`date +%m`      # month 01-12

if [ "$DAY" = "01" ] ; then
   DIR=monthly
   WEEK=m`date +%U`    # fresh snapshot file -> full (level-0) dump
elif [ "$WEEKDAY" = "0" ] ; then
   DIR=weekly
else
   DIR=daily
fi

# do backup
tar --create \
    --file=/backup/$DIR/backup_$DATE.tgz \
    --listed-incremental=/backup/log/backup_$WEEK.snar \
    --verbose \
    --exclude=dirs-that-should-not-be-included \
    --gzip /dirs/that/should/be/included
chmod 600 /backup/$DIR/backup_$DATE.tgz

The above script can be called directly from crontab.
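For example, to run it every night at 4am (script path made up):

0 4 * * * /path/to/bin/make_tar_backup >/dev/null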