Tag Archive: bash

Things I always search for when writing a new Bash script

November 25, 2020

Get the directory of the script

When writing a bash script, especially one that is distributed via a git repository and depends on other files in the same repository, it is often important to know the script's location so it can use paths relative to itself. Knowing the directory of the current script allows users to execute it from wherever they want, while the script can still find the files it depends on. This one-liner, found on Stack Overflow, is what I typically use to get the location of the current script:

DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"

RaspberryPi NAS

July 27, 2018

Guest Series at opensource.com

After my recent first guest article on https://opensource.com, the first part of a series about building a Raspberry Pi based NAS has just been published. You can read it here. There have been a surprising number of Twitter reactions to the article, and it seems it also collected more than a thousand visits on the first day.

Building a network attached storage device with a #RaspberryPi https://t.co/PrNwRsZnjr via @opensourceway by @manueldewald #RPi #OpenSource #FOSS

— Juha Remes (@jremes84) July 24, 2018

Update: Part II was published yesterday, with a similar reaction to the first one. It seems the topic of building a NAS with a Raspberry Pi for private use is quite popular. Also, thanks to Jen for editing my articles to make the text more readable.

Automating backups on a Raspberry Pi NAS https://t.co/mT87JYvVYT via @opensourceway by @manueldewald

— Jen Wike Huger (@JenWike) August 14, 2018

Update September 19, 2018: The third and final part of the Raspberry Pi NAS series has been published. The large number of tweets and retweets within the first 24 hours shows that there is real interest in this topic. This part explains how to install Nextcloud on a Raspberry Pi.

Thanks to the opensource.com team for encouraging me to start writing guest posts! It’s a nice experience to work with you.

Use Ansible to clone & update private git repositories via ssh

July 7, 2018

One of the first things I wanted to do when I started using Ansible was to clone a git repository on a remote machine, as I keep configuration, scripts, and source code in GitHub or GitLab repositories. Things that are not meant for the public I store in private repositories that I want to clone via ssh. Cloning and updating them is what I now want to automate with Ansible.

There are different ways to approach this task:

  • Check out the repo locally and copy it to the server via an Ansible synchronize task
  • Generate an ssh key on the server and manually allow cloning the repo with that key
  • Copy a local ssh key to the server and allow cloning the repo with that key
  • Use ssh-agent to load the local key and forward the agent to the server
While it might be tempting to just copy an ssh key to the remote server via Ansible, I find this quite risky, as it means copying a secret to persistent storage on a remote machine. Also, if you version your Ansible playbooks in a git repository as well, to be able to execute the playbook from somewhere else, the private key has to be versioned along with it.

Using ssh-agent, you can load your ssh key prior to provisioning the git repo on the remote server without copying it over, and without granting repository access to any key other than the one you already use for development.
Let's go through this with a simple example: say you want to run the following playbook, which ensures the git repository github.com/ntlx/my-private-repo is up-to-date.

---
- hosts: webserver
  tasks:
      - name: Ensure repo is up-to-date
        git:
            repo: git@github.com:ntlx/my-private-repo.git
            dest: repos/my-private-repo

I assume you have added your public ssh key to your github.com account, so you are able to clone and work on the repository locally. To clone the repository on the remote machine, you first need to load your ssh key into ssh-agent with the following command.

ssh-add ~/.ssh/id_rsa
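
If no agent is running in your shell yet, you may need to start one first; ssh-add -l then lists the loaded keys, which is a quick way to verify the key is available:

# Start an agent in the current shell if none is running yet (assumption:
# your environment does not already provide one, e.g. via a desktop session).
eval "$(ssh-agent -s)"
ssh-add ~/.ssh/id_rsa

# List the loaded keys to verify the identity is available.
ssh-add -l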

Now we need to enable forwarding of the ssh agent to the remote machine so we can access the loaded key remotely. There are different ways to do this, but I find it most convenient to do it in your ansible.cfg like this:

[ssh_connection]
ssh_args=-o ForwardAgent=yes

That way, you allow the forwarding for all your Ansible-managed hosts at once.

Now you can go on executing your playbook and should be able to clone the repository on the remote host.
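
A quick way to check that agent forwarding actually works is an ad-hoc command that lists the keys visible on the remote host. This is a sketch, assuming your inventory group is called webserver as in the playbook above:

ansible webserver -m shell -a 'ssh-add -l'

If the forwarded agent is reachable, this prints the fingerprint of the key you loaded locally.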

To make it even easier, we can add a task to load the ssh-key before executing the other tasks in the playbook. For this, add the local host to your Ansible inventory:

[local]
local_machine ansible_connection=local ansible_host=localhost

Now we can add a small shell task to load the ssh-key:

---
- hosts: local
  tasks:
      - name: Load ssh key
        shell: |
            ssh-add ~/.ssh/id_rsa

- hosts: webserver
  tasks:
      - name: Ensure repo is up-to-date
        git:
            repo: git@github.com:ntlx/my-private-repo.git
            dest: repos/my-private-repo

When you execute the playbook now, you shouldn't need to load the ssh key beforehand.
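
Running it then looks like this (a sketch; the inventory file name hosts and playbook file name playbook.yml are assumptions):

ansible-playbook -i hosts playbook.yml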

Back Up Your Files with Simple Bash Scripts

June 25, 2017

Ever lost data you stored on a USB drive, just because it stopped working and you did not have a backup? How often did you promise yourself to set up a backup system so this would not happen again, only to forget about it a few days later? You are not alone; so did I. That changed a few months ago, when I decided to store my data on my own NAS, run by a Raspberry Pi 3 and OwnCloud, to give me the feeling of having control over where my data is physically stored: on a USB drive below my desk, without a popup reminding me that my Dropbox is running out of space.

As hard drives tend to fail, I decided to put a backup system in place so the data is safe as long as no more than one of the two hard drives fails at a time. And this was quite easy, so I want to share the simple bash scripts I use to create incremental backups of my data.

The Strategy

First, here is the backup strategy I implemented:
 – For the last 5 years, keep a backup of the data as it was at the beginning of each year
 – For the last year, keep the first backup of each month
 – For the last month, keep the backup of every Monday
 – For the last week (7 days), keep the backup of every day

That sounds like a lot of data to store. But it is not, if you use the rsync argument --link-dest <folder>, which makes rsync create hard links in the target folder for every file that is unchanged compared to <folder>, instead of creating actual copies of the source. So, after the initial full copy, every new backup only needs space for the data that actually changed – hence the data we want to back up – plus some overhead for the folders and hard links.
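You can see this effect by comparing the inodes of an unchanged file in two consecutive snapshots; hard-linked files share the same inode. The paths here are purely illustrative:

# Print the inode of the same unchanged file in two snapshot folders.
ls -i /nas/backup/daily/2017-06-24/notes.txt /nas/backup/daily/2017-06-25/notes.txt
# The same inode number in both listings means the file is stored on disk only once.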
Here is the command we can use to create such incremental backups with rsync:

rsync -a --delete --link-dest ${LASTDAYPATH} ${DATADIR} ${TODAYPATH}

This command creates a backup of ${DATADIR} in ${TODAYPATH}, creating hard links to ${LASTDAYPATH} for all unchanged files.

The Scripts

Such a command should now be executed every night using a cron job.
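
A crontab entry like the following could do that. This is a sketch: the script name backup.sh is an assumption, and 2:00 am is an arbitrary choice; only the directory matches the SCRIPTDIR used below.

# m h dom mon dow command
0 2 * * * /nas/data/backup_scripts/backup.sh

The script below wraps the rsync call and prepares today's backup folder: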

#!/bin/bash

TODAY=$(date +%Y-%m-%d)
BACKUPDIR=/nas/backup/daily/
SCRIPTDIR=/nas/data/backup_scripts
DATADIR=/nas/data/

# The most recent existing backup becomes the hard-link reference for today's run.
LASTDAYPATH=${BACKUPDIR}/$(ls ${BACKUPDIR} | tail -n 1)
TODAYPATH=${BACKUPDIR}/${TODAY}

if [[ ! -e ${TODAYPATH} ]]; then
    mkdir -p ${TODAYPATH}
fi

rsync -a --delete --link-dest ${LASTDAYPATH} ${DATADIR} ${TODAYPATH} "$@"

# Remove backups that are no longer needed according to the retention strategy.
${SCRIPTDIR}/deleteOldBackups.sh

The data hard drive is mounted at /nas/data, the backup hard drive at /nas/backup. Every day the backup script creates a backup of the data drive on the backup drive (in the folder daily – which might be a misleading name, as we store all the backups in it).

At the end of the script, we trigger another script that deletes all the old backups which are no longer needed according to the backup strategy above.

#!/bin/bash

BACKUPDIR=/nas/backup/daily/

# First backup of each of the last 6 years
function listYearlyBackups() {
    for i in 0 1 2 3 4 5
    do ls ${BACKUPDIR} | egrep "$(date +%Y -d "${i} year ago")-[0-9]{2}-[0-9]{2}" | sort -u | head -n 1
    done
}

# First backup of each of the last 13 months
function listMonthlyBackups() {
    for i in 0 1 2 3 4 5 6 7 8 9 10 11 12
    do ls ${BACKUPDIR} | egrep "$(date +%Y-%m -d "${i} month ago")-[0-9]{2}" | sort -u | head -n 1
    done
}

# Monday backups of the last 5 weeks
function listWeeklyBackups() {
    for i in 0 1 2 3 4
    do ls ${BACKUPDIR} | grep "$(date +%Y-%m-%d -d "last monday -${i} weeks")"
    done
}

# Backups of each of the last 7 days
function listDailyBackups() {
    for i in 0 1 2 3 4 5 6
    do ls ${BACKUPDIR} | grep "$(date +%Y-%m-%d -d "-${i} day")"
    done
}

function getAllBackups() {
    listYearlyBackups
    listMonthlyBackups
    listWeeklyBackups
    listDailyBackups
}

function listUniqueBackups() {
    getAllBackups | sort -u
}

# Everything that is not on the keep list gets deleted.
function listBackupsToDelete() {
    ls ${BACKUPDIR} | grep -v -e "$(echo -n $(listUniqueBackups) | sed "s/ /\|/g")"
}

cd ${BACKUPDIR}
listBackupsToDelete | while read file_to_delete; do
    # Each backup is a directory, so it has to be removed recursively.
    rm -r ${file_to_delete}
done

The idea of this script is to first list all the backups that should be kept according to our strategy, and to then invert this selection to find the ones to delete.
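
The inversion works by joining the keep list into one alternation pattern for grep -v. A small hypothetical example with three kept backups illustrates the idea:

# Suppose listUniqueBackups prints: 2017-01-01 2017-06-01 2017-06-19
echo -n $(listUniqueBackups) | sed "s/ /\|/g"
# Output: 2017-01-01\|2017-06-01\|2017-06-19
# ls ${BACKUPDIR} | grep -v -e "<that pattern>" then lists every other snapshot.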

And that’s it! Not much magic is needed to create incremental backups without using too much space. My NAS has been running these scripts every night for 10 months now, currently backing up 607 Gigabytes of data, while the backups take 630 Gigabytes. You can find the current version of my simple bash scripts in this GitHub repository: https://github.com/NautiluX/backup_scripts