How to back up and restore Transmission files?

I don’t quite grasp the relationship between magnet links and torrent files. That being said, I’m looking to back up Transmission’s “files”, but which files?

Most downloads I’ve added to Transmission originated as magnet links, yet there are .torrent files in the config folder, as I would hope.

For all practical purposes, is it the actual .torrent files and the corresponding downloads that should be backed up?

The magnet link itself doesn’t seem to persist in any way.
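To the point about magnet links: once Transmission fetches a magnet link’s metadata, it writes an ordinary .torrent file into its config folder, so backing up that folder plus the downloaded data covers both cases. A minimal sketch of archiving the relevant pieces (the `~/.config/transmission` location and the `torrents`/`resume` subfolder names are typical for transmission-gtk/transmission-daemon on Linux; adjust them for your install):

```python
import tarfile
from pathlib import Path

def backup_transmission(config_dir: str, archive_path: str) -> list[str]:
    """Archive the parts of a Transmission config folder worth keeping:
    the .torrent files, per-torrent resume state, and settings.json."""
    config = Path(config_dir)
    saved = []
    with tarfile.open(archive_path, "w:gz") as tar:
        for name in ("torrents", "resume", "settings.json"):
            path = config / name
            if path.exists():
                tar.add(path, arcname=name)  # keep paths relative in the archive
                saved.append(name)
    return saved

# Typical usage (the path is an assumption -- check your Transmission preferences):
# backup_transmission(str(Path.home() / ".config/transmission"),
#                     "transmission-backup.tar.gz")
```

Restoring is the reverse: extract the archive back into the config folder while Transmission is stopped, and put the downloaded data back at the paths the resume files expect.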

GUI tool to backup PPAs and list of packages

I am looking for a GUI tool to backup and restore the list of PPAs and packages installed on the system.

I was using Aptik, but the free version is no longer maintained, so I need an alternative.

That tool was useful to me because I could see the list of PPAs alongside a column listing the packages installed from each specific PPA. Besides, the package list showed a description of each package, which helped me choose, and on first opening that view a default pre-selection was already made (based on I don’t know what logic). That’s why I am looking for something with a GUI rather than plain commands.

Any suggestion please?

Thanks

How does /etc/fstab back up data?

I was reading about the /etc/fstab file and found out that it can “dump” data; when I searched for the term, I understood that dump means backup.

# <file system>                <dir>  <type>  <options>  <dump>  <pass>
UUID=6a454a-bfd1-38989910eccd  /      ext4    defaults   1       1
  1. What does it back up? The whole filesystem?

  2. Where is the backup file?

  3. When does it run? On every boot?

  4. Does it need an external program called dump?
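For context on that entry: /etc/fstab does not perform any backup itself. The fifth column is the <dump> field, a flag read by the separate dump(8) utility, an old ext2/3/4 backup tool that only runs when you invoke it; it is not installed by default on most distributions, and nothing runs at boot. A value of 1 marks the filesystem as a candidate for dump, 0 tells dump to skip it, and if you never run dump there is no backup file anywhere. Reformatted, with a hypothetical second entry that opts out of dumping for contrast:

```
# <file system>                <dir>   <type>  <options>  <dump>  <pass>
UUID=6a454a-bfd1-38989910eccd  /       ext4    defaults   1       1
/dev/sdb1                      /data   ext4    defaults   0       2
```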

How to securely wipe the disk and use Time Machine for backup

I have a question regarding securely wiping an SSD. I read an article (https://www.makeuseof.com/tag/ssd-secure-delete-data/) about securely erasing an SSD with Parted Magic, and here is my plan for my MacBook Pro (2019, with the T2 security chip and macOS Mojave 10.14). First I will delete some “secret files” (or move them to a USB drive). Then I will make a Time Machine backup of the files I want to keep, along with the system settings, software, etc. After that I will use Parted Magic to securely wipe the SSD, so the drive should be clean, and finally I will restore from the Time Machine backup, where the deleted “secret files” should no longer be present. Is it certain that the deleted files won’t reappear? It sounds a bit complicated, but for me it means I can save my files while preventing recovery of the secret ones after the backup and restore of my MacBook Pro. Thanks

Backup with errors

I recently started out in the WordPress world and I’m trying to score points with the companies I work for. It turns out one of them completely changed hosting, so the site (which I did not build, and I cannot contact whoever did) went down; they brought the domain back up on another host and asked me whether I could upload the site again. The first alternative I tried was All in One Migration: I had a backup made, so I tried to upload it several times without success, until I finally got the site up, but I have a problem with the main logo. It appears like this: Incorrect menu

When it should actually appear like this: Correct menu

I really know very little about code; I’m more on the intuitive side of WP. I have tried every way of fixing it within my knowledge, but nothing! Can anyone think of why it won’t let me set the logo back the way it originally was?

The other option I have started to consider is another kind of backup: I have a WinRAR archive with a backup of the complete site plus a SQL dump. These were made by systems people unrelated to me, but I have access to the files. Is there a guide or a way to upload everything manually so that it works exactly as it did before (plugins, posts, etc.)? I tried to do it locally, but so far I haven’t managed to get the site up. Thanks in advance, and I hope someone can give me a hand!
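One common reason a logo or menu breaks after a host move like this is that WordPress stores absolute URLs in the database, so the old domain lingers in the options table and in attachment paths even after the files are restored. After importing the SQL dump on the new host, a hedged sketch of the usual fix (the `wp_` table prefix and the domain are placeholders; check wp-config.php for the real prefix):

```sql
-- Point the site at its new domain (values are placeholders)
UPDATE wp_options
   SET option_value = 'https://new-domain.example'
 WHERE option_name IN ('siteurl', 'home');
```

Note that URLs buried inside serialized theme settings need a serialization-aware tool (for example WP-CLI’s `wp search-replace`) rather than raw SQL.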

How to implement an incremental backup strategy? [on hold]

We have a project where we need a backup solution to back up websites. It is basically files plus a SQL database.

The websites are running PHP. We can only reach the website through HTTP(S). No (S)FTP or SSH. We have no control over the server configuration. Think of it as a WordPress website where the owner will install a plugin; that explains the no-FTP/SSH constraint.

We have not found any solution that fits these constraints.

The tasks:

  • Get the (modified) files
  • Store them
  • Reassemble a backup when needed.

We must be able to restore a website at any point within the last 365 days.

What are the possible strategies for storing incremental file backups so that the incremental backup can be reassembled EFFICIENTLY when needed (as one zip file)?

In other words, how do you store the increments you receive in a way that keeps things simple and lets you reassemble them as fast as possible?
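One way to make reassembly fast is to store each backup run as a folder holding only the changed and added files, plus a small manifest that maps every live path to the run that last stored it; restoring any day then means copying each file from the run its manifest points at, with no diff replay. A minimal sketch of that idea (the layout and names here are my own, not an established tool):

```python
import json
import shutil
from pathlib import Path

def store_increment(store: str, run_id: str, changed: dict[str, bytes],
                    deleted: list[str]) -> None:
    """Store one incremental run: write changed/added files under the run's
    folder and update a cumulative manifest {path: run_id}."""
    root = Path(store)
    manifest_file = root / "manifest.json"
    manifest = (json.loads(manifest_file.read_text())
                if manifest_file.exists() else {})
    run_dir = root / run_id
    for rel, data in changed.items():
        dest = run_dir / rel
        dest.parent.mkdir(parents=True, exist_ok=True)
        dest.write_bytes(data)
        manifest[rel] = run_id           # this run now owns the latest copy
    for rel in deleted:
        manifest.pop(rel, None)          # path no longer exists on the site
    manifest_file.write_text(json.dumps(manifest))
    # Keep a per-run snapshot of the manifest so any past day can be rebuilt.
    (root / f"{run_id}.manifest.json").write_text(json.dumps(manifest))

def reassemble(store: str, run_id: str, target: str) -> None:
    """Rebuild the full tree as of `run_id` by copying each file from the
    run that last stored it."""
    root = Path(store)
    manifest = json.loads((root / f"{run_id}.manifest.json").read_text())
    for rel, owner in manifest.items():
        dest = Path(target) / rel
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(root / owner / rel, dest)
```

Zipping the reassembled target directory then yields the single archive asked for. Pruning runs older than 365 days needs care: files in an old run that newer manifests still reference must be copied forward before the run is deleted.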

Should I backup root or home?

My aim is to back up my laptop so that it can be restored just as it was before, with all the files, apps, app data, themes, and settings. I’m afraid that if I only back up home, my themes and settings will not be backed up and only my personal files will. Is that true? I’m not very familiar with the filesystem yet…
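On what lives where: per-user themes and settings are dotfiles under home (~/.config, ~/.local, ~/.themes), so they do travel with a home backup, but installed applications and system-wide settings live in /usr, /etc, and /var, so a restore of the whole machine needs a backup of / that skips pseudo and volatile filesystems. A sketch of such an archive step, assuming a tar-style backup run as root (the exclude list is typical, not exhaustive):

```python
import tarfile

def backup_tree(source: str, archive_path: str, excludes: set[str]) -> None:
    """Archive `source` recursively as a gzipped tar, skipping any entry
    whose path (relative to source) starts with an excluded prefix."""
    def skip(member: tarfile.TarInfo):
        rel = member.name.lstrip("./")
        # Returning None excludes the entry (and, for a directory, its subtree).
        return None if any(rel == e or rel.startswith(e + "/")
                           for e in excludes) else member
    with tarfile.open(archive_path, "w:gz") as tar:
        tar.add(source, arcname=".", filter=skip)

# For a whole-system backup (run as root), the usual pseudo/volatile trees are
# excluded -- they are recreated at boot or hold mounted media:
# backup_tree("/", "/mnt/backup/full.tar.gz",
#             {"proc", "sys", "dev", "run", "tmp", "mnt", "media", "lost+found"})
```

Backing up only home remains worthwhile on its own: it is the part that cannot be reinstalled from packages.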

How to implement a good incremental backup strategy? [on hold]

We have a project where we need a backup solution to back up websites. It is basically files plus a SQL database.

The websites are running PHP. We can only reach the website through HTTP(S). No (S)FTP or SSH. We have no control over the server configuration. Let’s assume our sites are on shared hosting.

I have not seen any solution on the market where we can ask the provider to contact the website through HTTP(S) to initiate the process and have a script on the web server push files to the backup solution.

I came up with this scenario to solve the issue:

  1. Initially, the backup server calls the web server and asks for a full backup.
  2. The web server responds by dumping the database to a file and pushing it to the backup server.
  3. The web server also sends all files through HTTP.
  4. The backup server receives every file, stores it, and computes a hash for each file.
  5. Process completed.

The next time a backup is required, here is what happens:

  1. The backup server calls the web server and asks for an incremental backup, including a file containing all file hashes.
  2. The web server dumps the full database and pushes it to the backup server.
  3. The web server walks every file, looking for files whose hash differs from what the backup server sent (modified files), files not already in the list (added files), and files that are in the list but no longer on the file system (deleted files). Files are pushed accordingly.
  4. The backup server receives every file, stores it, and computes a hash for each file.
  5. Process completed.
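The classification step of the incremental pass, comparing the current tree’s hashes against the manifest the backup server sent, can be sketched like this (the function names are my own; in the scenario above the walking would happen in the PHP plugin on the web server, shown here in Python for illustration):

```python
import hashlib
from pathlib import Path

def hash_tree(root: str) -> dict[str, str]:
    """Map each file path (relative to root) to its SHA-256 hex digest."""
    root_path = Path(root)
    return {
        str(p.relative_to(root_path)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in root_path.rglob("*") if p.is_file()
    }

def classify(current: dict[str, str], previous: dict[str, str]):
    """Compare current hashes with the manifest from the last backup:
    returns (modified, added, deleted) path sets."""
    modified = {p for p, h in current.items()
                if p in previous and previous[p] != h}
    added = set(current) - set(previous)
    deleted = set(previous) - set(current)
    return modified, added, deleted
```

The web server would then push the modified and added files (plus the deleted list) over HTTPS, and the backup server recomputes hashes on receipt to build the manifest for the next run.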

We must be able to restore a website at any point within the last 365 days.

Here are my questions:

  1. Are these scenarios viable?
  2. Where can I find information about how to implement an incremental backup solution on the backup server? I’m looking for an in-depth explanation or implementation, not a basic article about what an incremental backup is.
  3. What is a good strategy to store the backups… but also to restore efficiently?

Do not hesitate to ask if you need more details. I have tried not to give too much information, as I don’t want to add constraints.