Cronjob not firing function within class that extends WC_Email

I’m really frustrated right now, because I just can’t seem to figure out what’s wrong here. Maybe it’s something really simple and I’m just stuck somewhere. Please help me out!

I’m working on a small plugin that adds a payment reminder to WooCommerce. I registered a scheduled event (nd_payment_reminder) with wp_schedule_event() in the activation hook; that part works.

In my main plugin file I’m including my class which looks a little like this:

class ND_Payment_Due extends WC_Email {
    public function __construct() {
        parent::__construct();
        add_action('nd_payment_reminder', array($this, 'getOpenOrders'));
    }

    public function getOpenOrders() {
        // getting data and triggering the email
    }

    [...]
}

Whatever I try, getOpenOrders() never seems to be called. I copied this plugin from another website where I originally used it, and it still works there, so I’m really confused right now.

I tested the hook from functions.php, and this works:

add_action('nd_payment_reminder', 'testfunction'); 

I also thought my getOpenOrders() function might be faulty, but even a trivial test function inside the class never runs. 🙁

Any ideas would be highly appreciated! Thanks in advance!
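One common cause in this kind of setup: a WC_Email subclass is normally only instantiated when WooCommerce builds its mailer (via the woocommerce_email_classes filter), and on a plain WP-Cron request that may never happen, so the constructor, and with it the add_action() call, never runs. The test from functions.php works precisely because that registration is unconditional. A hedged sketch of registering the hook in the main plugin file instead; the callback name and the 'ND_Payment_Due' array key are assumptions about how the class is registered:

```php
// Sketch: bind the cron hook unconditionally in the main plugin file,
// not inside the WC_Email constructor, so the callback exists even on
// cron requests where the email classes were never instantiated.
// The function name and the 'ND_Payment_Due' key are assumptions.
add_action('nd_payment_reminder', 'nd_run_payment_reminder');

function nd_run_payment_reminder() {
    // get_emails() instantiates all registered WC_Email subclasses
    // (including ours), so its methods become callable here.
    $emails = WC()->mailer()->get_emails();
    if (isset($emails['ND_Payment_Due'])) {
        $emails['ND_Payment_Due']->getOpenOrders();
    }
}
```

The design point is simply that the hook registration must not depend on WooCommerce’s email machinery having been loaded, since WP-Cron does not guarantee that.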

Cron job on Ubuntu hung halfway through, its PID file was removed, and it won’t restart

The job hung, its PID file got removed, and it won’t restart again. I am running Ubuntu on Azure with a couple of cron jobs that normally run perfectly; however, I cannot explain the random cron job failures, and the logs have not shown anything.

Is there a way I can find out whether my cron jobs have been failing?
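Cron itself is often silent about failures; on Ubuntu its activity lands in /var/log/syslog (`grep CRON /var/log/syslog`), but that only shows that a job started, not whether it succeeded. One approach is to wrap each job so its start time and exit status are logged explicitly. A minimal sketch; the log path and the wrapped command are placeholders:

```shell
# Minimal logging wrapper for cron jobs. In a real crontab you would use
# a fixed log path such as /var/log/myjob.log; mktemp is used here only
# to keep the sketch self-contained.
LOG=$(mktemp)

run_logged() {
    # Run the given command, recording start time and exit status so a
    # hung or failed job leaves a trace even when cron stays silent.
    echo "$(date -Is) starting: $*" >> "$LOG"
    if "$@" >> "$LOG" 2>&1; then
        echo "$(date -Is) OK: $*" >> "$LOG"
    else
        status=$?
        echo "$(date -Is) FAILED ($status): $*" >> "$LOG"
    fi
}
```

`grep FAILED` on the log then answers whether runs have been failing. For the "hung halfway and never restarted" case, running the job under `flock` with a lock file additionally prevents a stuck copy from silently blocking later runs.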

Restart VPN using cron-job service

Is there any way to automate restarting/resetting my VPN service after my kill switch kicks in? I imagine a cron job would need to run every minute to check whether I have an active connection; if not, a .sh file should stop the service and restart it. When you perform pings, is there any way to capture that data? I have performed pings with the kill switch on before, and they won’t return anything. Does anyone have any ideas, or am I approaching this the wrong way?
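The ping-based approach is workable, and the "capture" you need is simply ping's exit status: when the kill switch blackholes traffic, `ping -c 1 -W 2 <host>` times out and exits nonzero, which is exactly the signal the watchdog wants. A sketch of the check-and-restart logic, with the check and restart commands as parameters; in real use they might be `ping -c 1 -W 2 1.1.1.1` and `systemctl restart openvpn` (the service name is an assumption about your setup):

```shell
# Watchdog sketch for an every-minute cron job: if the check command
# fails (nonzero exit status), run the restart command.
check_and_restart() {
    # $1: command that exits 0 when the connection is up (e.g. a ping)
    # $2: command that restarts the VPN when it is down
    if ! $1 > /dev/null 2>&1; then
        $2
    fi
}

# Crontab entry (every minute), assuming the script lives at this path:
# * * * * * /usr/local/bin/vpn-watchdog.sh
```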


Tar is not Finishing When Running as Cronjob

My goal is to regularly pack the most important files on a server into a tar.gz file. In order to automate this I am using a cronjob to execute a script:

$ crontab -l
# blablabla
* * * * * /home/backup/ >> /home/backup/output.txt

For debugging reasons, I am currently running the job every minute.

As far as I can judge, it is not a cron problem, since the job executes every minute and runs as root (whoami in the script prints root).

The script that is running looks as follows:

#!/bin/sh -l

echo "Starting ..."

DATE=`date '+%Y-%m-%d_%H-%M-%S'`
BACKUP_PATH="backups/$DATE.tar.gz"

# MAKE BACKUP OF ENTIRE DATABASE
echo "Backing up database ..."
DATABASE_BACKUP_PATH="/home/backup/mysql.sql"
docker exec central-mysql sh -c 'exec mysqldump --all-databases -uroot -p"xxx"' > "$DATABASE_BACKUP_PATH"

# MAKE ARCHIVE
echo "Creating tar archive ..."
tar -vczf "$BACKUP_PATH" \
        /home/database/docker-compose.yml \
        /home/ \
        /home/gitlab/docker-compose.yml \
        /home/gitlab/data/git-data/repositories \
        /home/mailserver/docker-compose.yml \
        /home/mailserver/mail/dkim \
        /home/mailserver/rainloop/_data_/_default_/configs/application.ini \
        /home/mailserver/rainloop/_data_/_default_/domains \
        /home/mailserver/ \
        /home/php7-apache-alpine \
        /home/sftp \
        /home/traefik \
        /home/websites \
        "$DATABASE_BACKUP_PATH"

# CLEAN-UP
echo "Cleaning up ..."
rm "$DATABASE_BACKUP_PATH"

The docker command runs without problem, creating a .sql file. However, the tar command starts to list the first few files, and then just stops somehow. output.txt looks like this:

Cleaning up ...
Starting ...
Backing up database ...
Creating tar archive ...
/home/database/docker-compose.yml
/home/
/home/gitlab/docker-compose.yml
/home/gitlab/data/git-data/repositories/
/home/gitlab/data/git-data/repositories/websites/
/home/gitlab/data/git-data/repositories/websites/sbpp.git/
/home/gitlab/data/git-data/repositories/websites/sbpp.git/config
/home/gitlab/data/git-data/repositories/websites/sbpp.git/HEAD
/home/gitlab/data/git-data/repositories/websites/sbpp.git/hooks
/home/gitlab/data/git-data/repositories/websites/sbpp.git/info/
/home/gitlab/data/git-data/repositories/websites/sbpp.git/info/exclude
Cleaning up ...

Question: Are there known reasons for tar to just stop compressing? Do you see errors in my script?

The strange thing about it is that when I run the script manually, it works just fine. Is it a cronjob thing after all? Do I have permission problems, even though I am running as root?
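One concrete difference between a manual run and a cron run is the working directory: BACKUP_PATH is relative (`backups/$DATE.tar.gz`), so under cron the archive lands relative to whatever directory cron starts in, not the directory you test from. Note also that tar's error messages go to stderr, which `>> output.txt` does not capture (adding `2>&1` to the crontab line would), and that `/home/` is in the file list, so if the growing archive or output.txt ends up under /home, tar is reading files that change underneath it. A sketch of making the destination absolute; the directory name is an assumption:

```shell
# Build an absolute archive path; relative paths resolve differently
# under cron, whose working directory is not your interactive shell's.
make_backup_path() {
    # $1: absolute destination directory (an assumption; adjust to taste)
    mkdir -p "$1"
    echo "$1/$(date '+%Y-%m-%d_%H-%M-%S').tar.gz"
}

# In the script, this would replace the relative assignment:
# BACKUP_PATH=$(make_backup_path /home/backup/backups)
```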

How to download zip archive and extract file in Google App Engine cronjob?

I’m new to GCP and working within a preexisting App Engine project.

My task is basically: hit a third-party website (HTTP GET) to download a zip file, extract a specific file from the archive (which happens to be a TSV file), do some very basic processing on it, and then store the results in our internal systems (a Cloud SQL instance).

They already have a pattern of defining Python cron jobs via cron.yaml in App Engine, so I was looking at putting it there. I want to avoid doing this all in memory, since the file might be a few gigs. If I weren’t on GCP and had a local filesystem to work with, I’d download the file, unzip it on disk, open a file handle, and stream through the file that way.

Do I use blobstore in place of a local filesystem?
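Blobstore is a legacy API; its usual replacement for scratch object storage is Cloud Storage. For the streaming part, App Engine does give you a writable /tmp, with the caveat that on the standard environment /tmp is RAM-backed and counts against instance memory, so for multi-gig files the flexible environment, or staging the download in Cloud Storage, is the safer bet. The download-and-extract step itself can use the ordinary filesystem API either way; a sketch, where the URL and the member name are placeholders:

```python
import csv
import io
import shutil
import tempfile
import urllib.request
import zipfile

def fetch_tsv_rows(url, member_name):
    """Stream a zip from `url` to a temp file, then iterate over the rows
    of one TSV member without loading the whole archive into memory."""
    with tempfile.NamedTemporaryFile(suffix=".zip") as tmp:
        with urllib.request.urlopen(url) as resp:
            shutil.copyfileobj(resp, tmp)  # chunked copy, not resp.read()
        tmp.flush()
        with zipfile.ZipFile(tmp.name) as zf:
            with zf.open(member_name) as member:
                text = io.TextIOWrapper(member, encoding="utf-8")
                for row in csv.reader(text, delimiter="\t"):
                    yield row
```

`zf.open()` returns a file-like object over the single compressed member, so the TSV is decompressed as it is read; each row can then be processed and written to Cloud SQL one batch at a time.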