
Resolved
0 votes
Hello all,
I just rented a Synology in the cloud to keep an external copy of my data. The idea is to install a BackupPC server on it and then connect to my local system to retrieve the data. For that, I'd like to connect using rsync over SSH. But to do that, I need to open port 22 on my firewall, and I'm not very enthusiastic about that. So I'd like to open it only for a specific IP address (the public one of my Synology). Is there a way to do that?
And, second question, would it be better to open a port other than 22 and then redirect it to my SSH server?
thanks to all for your help :)
Arnaud
Tuesday, April 24 2018, 01:41 PM

Accepted Answer

Wednesday, April 25 2018, 01:35 PM - #Permalink
I've just done a test:
[root@7 ~]# mkdir temp1
[root@7 ~]# echo "data" > temp1/testfile
[root@7 ~]# cat temp1/testfile
data
[root@7 ~]# mkdir temp2
[root@7 ~]# cp -l temp1/testfile temp2/testfile2
[root@7 ~]# cat temp2/testfile2
data
[root@7 ~]# rm -f temp1/testfile
[root@7 ~]# cat temp1/testfile
cat: temp1/testfile: No such file or directory
[root@7 ~]# cat temp2/testfile2
data
In plain speak:
Create a temp folder, temp1
In the folder create a file, testfile containing the text "data"
Check the file contents
Create another temp folder temp2
Copy temp1/testfile to temp2/testfile2 as a link
Check the contents of temp2/testfile2 - it is the same as temp1/testfile.
Delete temp1/testfile
Check it is not there
Check the contents of temp2/testfile2 - it still contains "data".

This means you should be able to delete the oldest folder, and any of its files which also exist in a later folder will survive there, effectively becoming part of the next oldest remaining backup.

This looks great, but please test it for yourself. It means that after the first full copy you only need to do incremental copies, and you can delete backups from the oldest up to the newest-but-one and still be left with a complete backup. There is no need to do a merge or any more full backups.
Responses (14)
  • Accepted Answer

    Friday, April 27 2018, 07:14 AM - #Permalink
    Hello all,
I made a first test with hardlinks and it worked great!! Thanks to all :)
  • Accepted Answer

    Wednesday, April 25 2018, 01:44 PM - #Permalink
Thanks very much Nick, it looks great :)
In the meantime, I found this post with a script which does exactly that and removes the oldest backups :) Looks great too :)
    I'll make a test :)

    https://gist.github.com/morhekil/8382294
  • Accepted Answer

    Wednesday, April 25 2018, 01:36 PM - #Permalink
    Arnaud Forster wrote:
Does that mean that the file is stored once and, as long as not all its hardlinks are deleted, it will still be there, even if the first full backup is deleted?
    Thanks :)
    It seems so!
  • Accepted Answer

    Wednesday, April 25 2018, 01:19 PM - #Permalink
I probably didn't fully understand the behaviour of a hardlink, because I found a post which says (as Dave said):

    "Most of the time when I hear full backups and incremental backup people mean:
    Full: Backup all the data.
    Incremental: Backup only the changes.

    If you need to restore a backup then you start with the full backup and then all the incrementals. That can take a lot of time. This is one reason why many corporations do full backups in the weekend and partial ones during weekdays. Up to five partials is manageable.

Now rsync usually does not make partial backups. It sends only the changes over the net, but the end result is a full copy of all the data. Thus the most common reason not to use only partials does not apply."

Does that mean that the file is stored once and, as long as not all its hardlinks are deleted, it will still be there, even if the first full backup is deleted?
    Thanks :)
  • Accepted Answer

    Wednesday, April 25 2018, 12:37 PM - #Permalink
    Hello Nick,
I agree with you, I can't keep incrementing forever; that's why I'd like to be able, once a week, to 'merge' the last full backup and the incremental ones into a new 'full' backup. Regarding the hardlink approach, I found this post which seems interesting: (http://speakmy.name/2014/01/12/automatic-backups-with-ruby-and-linux-shell-part-3/)
As the amount of data I have to back up is about 900GB, I can't send all of it over the internet every week! :)

@Tony: I'll check that part about having a passphrase on both sides, thanks :)
  • Accepted Answer

    Wednesday, April 25 2018, 10:22 AM - #Permalink
Dave's link is interesting, and so are some of the follow-up posts (like not needing the "cp -al") and so on. If scripting, you'd need to do a baseline perhaps on day 1, then 6 days of increments, and rotate that somehow. You can't keep incrementing forever - at least you can, but it probably does not make much sense.

    If using the /etc/cron.daily method to run daily, make sure you launch rsync into the background with a trailing "&" or it can end up blocking the following jobs. I don't know about crontab. It may be OK.

I've tried reading up a bit about deleting the files and I have not got my head around it. It may work quite nicely. For example, if you delete the main backup, I think the hard-linked files then persist in the next incremental, but I am not sure. Can you confirm?
  • Accepted Answer

    Wednesday, April 25 2018, 09:39 AM - #Permalink
For backups I use rsync over ssh here... ssh is set up to use a passphrase, and login between systems is automated using keychain and public/private keys. It's not a good idea to use ssh with no password or passphrase... IMHO easier than messing with certificates...
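A sketch of that setup. The host name, user name, key file name, and paths are all placeholders, not from this thread:

```shell
# One-off setup: a key pair protected by a passphrase
ssh-keygen -t ed25519 -f ~/.ssh/id_backup

# Install the public key on the remote side
ssh-copy-id -i ~/.ssh/id_backup.pub backup@synology.example

# In the backup script: keychain caches the decrypted key between
# cron runs, so the passphrase only has to be typed once per boot
eval "$(keychain --eval --quiet ~/.ssh/id_backup)"
rsync -a -e "ssh -i ~/.ssh/id_backup" /srv/data/ backup@synology.example:/backups/
```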
  • Accepted Answer

    Wednesday, April 25 2018, 07:53 AM - #Permalink
ok, so if I understand right, I should do this?

1st backup (COS --> Synology folder #1): full backup without hardlinks

2nd backup: Synology folder #1 --> Synology folder #2 with hardlinks
COS --> Synology #2 with hardlink & delete

3rd backup: Synology folder #2 --> Synology folder #3 with hardlinks
COS --> Synology #3 with hardlink & delete

and so on...?
Thanks :)
  • Accepted Answer

    Wednesday, April 25 2018, 07:07 AM - #Permalink
    Thanks Nick,
I saw a post about rsync over ssh without a password. I have to create keys on both sides and exchange them. I'll try this way because, if it works, I'll try to back up different COS systems :)
But first I need to understand Dave's post :)
  • Accepted Answer

    Wednesday, April 25 2018, 06:50 AM - #Permalink
From memory, one issue with running rsync over ssh was that it was not easy to automate the login. You certainly can't put the password in the rsync command, but you may be able to work with ssh keys instead. For my remote backup (to a Pi), I have the Pi connect to me by OpenVPN and then run rsync as a daemon on the Pi. When run as a daemon it does not use ssh at all. I use this internally for speed (no encryption/decryption overhead) and both internally and externally for automation of the backups.
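A minimal version of that daemon setup. The module name, paths, and host are invented for illustration:

```shell
# /etc/rsyncd.conf on the backup machine -- a single hypothetical module:
#
#   [backup]
#       path = /backups
#       read only = false
#
# Start the daemon there:
rsync --daemon

# From the source machine, across the VPN -- note rsync:// (no ssh):
rsync -a --delete /srv/data/ rsync://pi.example/backup/
```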
  • Accepted Answer

    Wednesday, April 25 2018, 06:45 AM - #Permalink
    Hello Dave,
Thanks very much for the information :) Yes, I just realized that I can't install BackupPC on the DSM I just rented in the cloud; so I'm going to make my backups using rsync over ssh. I'll check the link you sent and try to set up the different rsync runs with crontab :)
    Thanks very much :)
  • Accepted Answer

    Wednesday, April 25 2018, 06:07 AM - #Permalink
    BackupPC is a great program for doing incremental/differential backups just as you describe.

    That being said, you can still do data deduplication with rsync provided that your volume that you are backing up to supports hardlinks. So you can do this on EXT3/4 or XFS quite easily.

    Simply make a copy using hard links of your previous week's dataset to a new directory named this week and then instead of rsyncing to the previous week, rsync with '--delete' to this week's set. The '--delete' will remove files from the folder for this week, effectively unbinding the connection to the other folder's object and then any changed files will get updated via deltas effectively splitting that file from the previous copy as well. The two key benefits here are uber fast backups and optimized space through deduplication provided by hard links.

    Each folder will look and act like a full backup. When you 'stat' the file in a directory, you'll see how many times the hardlink exists.

    Here is an overview of the process/technology.
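The 'stat' check is easy to try on a throwaway file (names here are made up):

```shell
#!/bin/sh
set -e
WORK=$(mktemp -d)                    # scratch area for the demo
echo "data" > "$WORK/orig"
ln "$WORK/orig" "$WORK/copy"         # what 'cp -al' does file by file

# Both names share one inode; the link count on each is 2
stat -c 'links=%h inode=%i' "$WORK/orig"
stat -c 'links=%h inode=%i' "$WORK/copy"
```

Deleting either name leaves the data reachable through the other until the link count drops to zero.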
  • Accepted Answer

    Wednesday, April 25 2018, 05:22 AM - #Permalink
    Hello Nick,
    Thanks so much, I'll try that :)
I thought of running the rsync from the remote BackupPC system on the Synology. But now I wonder if it's very useful... maybe a single script would be enough. The only problem is that I wouldn't like to make a full backup every week because it would take too long. So I was wondering if BackupPC is able to 'make' a new full backup out of the one from the week before plus the incremental ones of the week. (I don't know if what I said is clear...) :)
    Thanks very very much :)
  • Accepted Answer

    Tuesday, April 24 2018, 02:14 PM - #Permalink
    To open an incoming port, from the command line use:
    iptables -I INPUT -s your_remote_IP -p tcp --dport 22 -j ACCEPT
    If it works, change "iptables" to "$IPTABLES" and put it in the custom firewall.

    Which device is going to run the rsync? If it is ClearOS then you don't need to open its ports, just the other end.

    It would be safer to use a different port if you're opening the port to more than one IP, but if you're opening it to just one IP then I can't see it will make any difference.
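In the custom firewall that rule would look something like this (203.0.113.10 is a documentation placeholder; substitute the Synology's public address):

```shell
# Custom firewall entry -- ClearOS supplies the $IPTABLES variable:
$IPTABLES -I INPUT -s 203.0.113.10 -p tcp --dport 22 -j ACCEPT
```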