I have a secure, private AWS EC2 environment, but I need to back up MongoDB and PostgreSQL, so I have a separate EC2 instance dedicated to backups, and I occasionally open ports 80 and 443 to install or update software on it.
I use shell scripts to run the backup jobs, which requires hardcoding passwords or credentials in the scripts. That doesn't feel secure enough: all the credentials end up saved in one place, the backup instance.
How can I secure the backup instance so that passwords/credentials are never stored in plain text? Ideally I'd also like to avoid keeping them in memory or in temporary files.
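One common pattern, independent of any particular backup tool, is to keep secrets out of the scripts entirely and fetch them at runtime from AWS SSM Parameter Store (or Secrets Manager), authenticating with the instance's IAM role so nothing sensitive lives on the backup instance's disk. A minimal sketch, assuming the AWS CLI is installed, the instance role is allowed `ssm:GetParameter`, and the parameter path, host, and database names are made up for illustration:

```shell
#!/bin/sh
# Sketch only: the parameter path, host, and DB names are hypothetical.
# The instance's IAM role authenticates the API call, so no credential
# is stored on disk; the secret exists only in this process's environment.
PGPASSWORD=$(aws ssm get-parameter \
  --name /backup/postgres/password \
  --with-decryption \
  --query 'Parameter.Value' \
  --output text)
export PGPASSWORD
pg_dump -h db.internal -U backup_user mydb | gzip > /var/backups/mydb.sql.gz
unset PGPASSWORD
```

This doesn't fully satisfy the "never in memory" goal (the secret briefly sits in the process environment), but it removes plain-text storage and lets you rotate and audit the credential centrally.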
I use duplicity on a few Ubuntu servers to encrypt backups and send them to a backup server, which then sends another copy to rsync.net; once a week, give or take, I download these backups to a local server.
The problem I have with this is that to send these encrypted archives to the backup server, each server has a password-less SSH key that allows them to connect to the backup server.
Each server has its own user on the backup server, file changes are monitored with OSSEC, and each user can only write to its own backup directory. Even so, I fear that a compromised server — ransomware, specifically — could damage the backup server as well.
I thought about doing the inverse and having the backup server connect to the other servers, grab what it needs, and then shut itself down, but that seems worse, as a compromised backup server would have access to the entire server inventory.
So, I am wondering: what is the best way to keep the backup server safe? Is there better software than duplicity for handling this?
Thanks in advance!
EDIT: A few details I forgot to include that might be worth something:
- Backup server is set up with a hardware RAID controller in RAID 10 with 15 drives
- Backup server uses Ubuntu 18.04 with EXT4 as the filesystem
- Backup server is a dedicated server with plenty of RAM and CPU power
- Backups stay inside /var/backups/SERVER-NAME/
- Client servers are all unprivileged LXD containers
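One mitigation that fits the existing setup, whatever backup software ends up being used, is to pin each password-less key to a forced command in `authorized_keys` on the backup server, so a stolen key can never open a shell and can only run rsync jailed inside that server's own directory. A sketch, assuming the duplicity rsync backend and that the `rrsync` helper script (shipped with rsync, often under /usr/share/doc/rsync/scripts/) has been copied to /usr/local/bin — the user name, path, and key below are placeholders:

```shell
# /home/server1/.ssh/authorized_keys on the backup server (sketch).
# "restrict" disables port/agent/X11 forwarding and PTY allocation;
# the forced command jails the key to rsync within this one directory.
command="/usr/local/bin/rrsync /var/backups/server1/",restrict ssh-ed25519 AAAA... backup@server1
```

Newer versions of rrsync also accept a `-wo` (write-only) flag, which gets closer to an append-only target. Note that a compromised client could still overwrite its own existing archives, so true immutability needs something server-side, such as filesystem snapshots or an append-only mode like borg's `borg serve --append-only`.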
For some reason I need to read the LSN from T-SQL log backups without restoring them, or even restoring only their headers (I assume that even restoring just the header would change the LSN on the database side too, but I'm not sure).
So, are the T-SQL log backup files encrypted, or do they have a special structure? Any information on where I should start?
Could anyone confirm or deny whether restoring only the header would affect sys.fn_dblog or anything else?
I have a folder with thousands of files sorted by modified date so I can resume from where I left off. Recently, that HDD filled past the 90% mark, and filesystem access became so slow that the system freezes for a while.
While my primary HDD is fast, the one holding those files reads at 30 MB/s (GNOME Disks benchmark). But it never did such a thing before. So I decided to run e4defrag. It solved the problem — no more freezes — but it messed up the directories' sorting by modified date.
Before I can clone that disk to a bigger one and expand the partition, I know I will need to run e4defrag again. Is there a way to back up the files' modified times and restore them after e4defrag?
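If it comes to that, the modified times can be captured into a plain file before the defrag and re-applied afterwards. A rough sketch with GNU find and touch (the directory and stamp-file paths are placeholders):

```shell
#!/bin/sh
# Sketch: save every file's mtime to a stamp file, then re-apply it.
# Assumes GNU find/touch; DIR and STAMP are placeholders.
DIR=${1:-.}
STAMP=/tmp/mtimes.txt

# Save one "epoch-seconds<TAB>path" line per file.
find "$DIR" -type f -printf '%T@\t%p\n' > "$STAMP"

# ... run e4defrag "$DIR" here ...

# Restore each recorded mtime.
while IFS="$(printf '\t')" read -r t f; do
  touch -d "@$t" -- "$f"
done < "$STAMP"
```

Caveat: this simple version breaks on filenames containing tabs or newlines; for those, `find -printf '%T@\0%p\0'` with a null-delimited reader would be needed.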
I want to back up the firmware of an SD card so that I can later compare it and verify that no changes were made.
How do I back up the firmware of the microcontroller chip of an SD card? I'd prefer a method that works from within the operating system, with no additional hardware, if that's possible.
Are there any free or low-cost "full backup then incremental backup" software options for Ubuntu…
…which are also capable of taking the previous full backup and the follow-up collection of incremental backups, and using them as a group to construct a new synthetic full backup?
I want to copy a Timeshift backup folder to my USB flash drive.
However, any time I try to move the folder I get "permission denied" with this error: http://i.imgur.com/eqBGuig.png
I have tried with sudo from a terminal, but I get the same permission-denied error, which is weird, as I thought sudo let you do everything.
I have a Dell XPS 9570 with a dual-boot setup (Ubuntu alongside Windows 10). Normally the GRUB2 menu works like a charm, except in one very specific case: when an external HD containing a Clonezilla image is plugged in, GRUB2 never shows up and Windows boots automatically. The problem is not associated with the presence of an external HD in general, but only with an HD that contains a Clonezilla image in one of its directories. Any idea why that happens?
Can I use the backup I created in 18.04.3 to restore files to 19.04?