Poor Man's Dropbox
Recently, I’ve started receiving a notification from the Dropbox desktop app on Linux saying that they’ll stop syncing my files in November. I couldn’t understand why. I’ve been using this service for some time, and I find it really useful. Moreover, I want to back up several important files on a web server in case my disk crashes or I need to access them from another computer or mobile device. I googled this issue and found a pretty long thread on the Dropbox Forum. There’s also an ongoing discussion on Reddit.
If you don’t want to read all of that, here’s a short summary:
- Dropbox app will support the following file systems: NTFS for Windows, HFS+ or APFS for Mac, and Ext4 for Linux
- Dropbox app will not support encrypted directories even if you’re using Ext4 file system on Linux
I had a few options:
- leave my Dropbox directory unencrypted
- switch to another OS
- change service provider to something else
- create my own solution
I’m not going to change my OS now, and the existing alternatives to Dropbox aren’t good enough for me.
I’ve decided to create a Poor Man’s Dropbox with cron. I think it’s a nice opportunity to learn something new and solve my own problem at the same time. I have my own personal FTP server where I host my website. This server has about 4 GB of disk space. I’m not using all of it, because my website contains only text and maybe a few small pics. I also store my photos on the local drive, Google Photos, and an external physical drive, and I’m not going to back them up on the FTP server. I’d like to back up several dirs with documents, config files, text files, and spreadsheets. In that case, the mentioned amount of disk space is enough.
In the beginning, I’ve created a special directory for backup in my local file system:
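The exact command isn’t shown here; assuming the directory lives directly under the home directory (a path I’ve picked purely for illustration), it would be something like:

```shell
# Create a local staging directory for backups;
# the path ~/backup is an assumption, not necessarily the author's actual path
mkdir -p "$HOME/backup"
```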
Next, I’ve created a `backup/` directory on the FTP server.
After that, I’ve prepared the `make_ftp_backup` script.
Once I had my script ready, I made it executable:
chmod +x make_ftp_backup
and placed it in
Now, I could test it by invoking it directly in the terminal.
The script is pretty simple. It deletes old backup files from the local directory, copies updated files into it, removes old files on the FTP server, and uploads fresh backup files to the server.
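The script itself didn’t survive in this excerpt, so here is a minimal sketch of what it might look like, following the steps just described. The server address, credentials, directory paths, and the choice of `lftp` as the FTP client are all my assumptions, not the author’s actual setup; as in the post, the script is written to a file so it can be made executable with `chmod +x`:

```shell
# Write a sketch of make_ftp_backup to a file, then make it executable.
cat > make_ftp_backup <<'SCRIPT'
#!/bin/sh
# make_ftp_backup -- minimal sketch; host, credentials and paths below
# are placeholders, and lftp is just one possible FTP client.
set -eu

LOCAL_BACKUP="$HOME/backup"              # local staging directory (assumed path)
SOURCES="$HOME/Documents $HOME/.config"  # directories to back up (assumed)
FTP_HOST="ftp.example.com"               # placeholder server address
FTP_USER="user"                          # placeholder credentials
FTP_PASS="secret"

# 1. Delete old backup files from the local staging directory
#    (:? aborts if the variable is somehow empty, to avoid rm -rf /*)
rm -rf "${LOCAL_BACKUP:?}"/*

# 2. Copy updated files into the staging directory
#    (SOURCES is deliberately unquoted so it splits into separate paths)
cp -r $SOURCES "$LOCAL_BACKUP"

# 3. Remove stale files on the FTP server and upload fresh copies:
#    `mirror -R --delete` pushes the local tree to the server and deletes
#    remote files that no longer exist locally
lftp -u "$FTP_USER,$FTP_PASS" "$FTP_HOST" <<LFTP
mirror -R --delete "$LOCAL_BACKUP" /backup
quit
LFTP
SCRIPT
chmod +x make_ftp_backup
```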
Of course, I’m not going to make backups manually, so I used cron to automate this job. I ran `crontab -e` to define a new cron job via vim (you can use any editor of your choice).
It looks as follows:
# Backup my important files to my FTP server every hour at minute zero
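The crontab entry under that comment didn’t survive in this excerpt. Assuming the script sits at `/home/user/bin/make_ftp_backup` (an invented path; use the full path on your machine), an hourly entry matching the comment would read:

```
0 * * * * /home/user/bin/make_ftp_backup
```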
Please note that we shouldn’t rely on `$PATH` or other environment variables in crontab, because most of them are ignored by this service. That’s why I provided the complete path to the script.
We can list our cron jobs by typing `crontab -l`.
After editing, we could call:
sudo service cron reload
or (if it didn’t work):
sudo service cron restart
The cron job should start working without them, but in case of problems, you can invoke these commands.
As you can see in the comment, I’m running my script every hour of every day, at minute zero.
I’ve found a very nice video explaining how to use cron. If you’re not familiar with it, check this link.
The author of the video created useful code snippets explaining the crontab syntax.
The reference information for cron jobs placed there looks as follows:
# ┌───────────── minute (0 - 59)
# │ ┌───────────── hour (0 - 23)
# │ │ ┌───────────── day of the month (1 - 31)
# │ │ │ ┌───────────── month (1 - 12)
# │ │ │ │ ┌───────────── day of the week (0 - 6) (Sunday to Saturday)
# │ │ │ │ │
# │ │ │ │ │
# * * * * * command to execute
It really helps when preparing a new cron job schedule. You can also visit the crontab.guru website to test your cron job definitions. It’s also worth reading more about troubleshooting crontab in case you encounter any problems.
The solution above just uploads local files to the remote server. Sometimes we may want to add some data from another computer or mobile device and access it later on the local computer. We shouldn’t place this data in the `backup` directory, because it may be overwritten by the backup script. To solve this problem, we can create an `input` directory on the remote server and an `input` directory in the local file system. After that, we can create a `get_ftp_input` script, which will copy the remote directory to the local file system.
It’s not real synchronization like in Dropbox, but at least we have a simple way to add new files to the local computer from a different device with Internet access and an FTP client. We can add this script to the crontab too, if we want.
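A `get_ftp_input` script like the one described could be sketched as follows. Again, the host, credentials, and paths are placeholders of mine, and `lftp` is just one possible client:

```shell
# Write a sketch of get_ftp_input to a file, then make it executable.
cat > get_ftp_input <<'SCRIPT'
#!/bin/sh
# get_ftp_input -- minimal sketch; host, credentials and paths
# are placeholders, not the author's actual configuration.
set -eu

FTP_HOST="ftp.example.com"   # placeholder server address
FTP_USER="user"              # placeholder credentials
FTP_PASS="secret"
LOCAL_INPUT="$HOME/input"    # local input directory (assumed path)

mkdir -p "$LOCAL_INPUT"

# Download the remote input/ directory into the local one;
# plain `mirror` copies from the server to the local file system
lftp -u "$FTP_USER,$FTP_PASS" "$FTP_HOST" <<LFTP
mirror /input "$LOCAL_INPUT"
quit
LFTP
SCRIPT
chmod +x get_ftp_input
```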
On my Android phone, I’m using the Solid Explorer app for browsing files and directories. It’s a really good app with a built-in FTP client, so I can access my backup folder from my phone if I need to.
I know this solution is far from perfect (I call it poor myself) and it doesn’t handle two-way synchronization, but at least I have full control over it, and nobody tells me which file system or operating system I have to use, or what I need to leave encrypted or unencrypted. It’s clean, simple, and fine for me for now. Maybe I’ll enhance this solution some day, e.g. by adding a new script that performs a backup whenever I change any file in a defined list of directories. In order to do that, I’ll need a file watcher running as a system service in the background. The biggest room is the room for improvement.
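The file-watcher idea could be sketched with `inotifywait` from the inotify-tools package; the watched paths and the backup script location below are assumptions of mine, not part of the original post:

```shell
# Write a sketch of a possible watcher script, then make it executable.
cat > watch_and_backup <<'SCRIPT'
#!/bin/sh
# watch_and_backup -- sketch of the future-enhancement idea: whenever
# anything changes in the watched directories, run the backup script.
# Requires inotify-tools; paths below are placeholders.
set -eu

WATCHED_DIRS="$HOME/Documents $HOME/.config"   # assumed directory list

# Block until a change event occurs, then run the backup, and repeat
while inotifywait -r -e modify,create,delete,move $WATCHED_DIRS; do
    "$HOME/bin/make_ftp_backup"   # assumed location of the backup script
done
SCRIPT
chmod +x watch_and_backup
```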