## Shortest sequence of jobs, with dependencies, subject to capacity constraints

Suppose I have $$n$$ courses, some with prerequisites, and I can take up to $$k$$ courses in a semester. I want to compute the least number of semesters needed to complete all courses, while respecting prerequisites.

Equivalently: suppose I have a DAG $$G=(V,E)$$ and a positive integer $$k$$. The desired output is a sequence $$S_1,\dots,S_m$$ of sets of vertices that minimizes $$m$$, subject to the constraints that $$S_1 \cup \dots \cup S_m = V$$, $$|S_i| \le k$$ for all $$i$$, and every edge goes from some set $$S_i$$ to some set $$S_j$$ with $$i < j$$ (i.e., there is no edge $$v \to w$$ where $$v \in S_i$$, $$w \in S_j$$, and $$i \ge j$$).

Is there a polynomial-time algorithm for this problem?

Approaches I’ve considered:

The obvious greedy strategy, a variant of Kahn’s algorithm, is: in each semester, arbitrarily pick up to $$k$$ courses whose prerequisites have all been previously taken, and take those courses. Unfortunately, this algorithm is not guaranteed to generate an optimal schedule. For example, in the graph with vertices $$v_1,v_2,v_3,v_4$$ and the single edge $$v_1 \to v_2$$, with $$k=2$$ this algorithm might generate the schedule $$\{v_3,v_4\},\{v_1\},\{v_2\}$$, which is longer than the optimal schedule $$\{v_1,v_3\},\{v_2,v_4\}$$.

The next natural idea is to modify the above approach by breaking ties in favor of vertices that are part of longer dependency chains. I’m not sure whether this works or not.
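To make the tie-breaking idea concrete, here is a sketch (a heuristic only; I make no claim of optimality for general DAGs) of Kahn’s algorithm where, each semester, we prefer available courses that start the longest remaining dependency chains:

```python
from collections import defaultdict, deque

def schedule(n, edges, k):
    """Greedy semester scheduler: each semester, take up to k available
    courses, preferring those with the longest chain of dependents.
    Vertices are 0..n-1; edges are (prerequisite, dependent) pairs."""
    succ = defaultdict(list)
    indeg = [0] * n
    for u, v in edges:
        succ[u].append(v)
        indeg[v] += 1

    # Topological order via Kahn's algorithm.
    deg = indeg[:]
    q = deque(i for i in range(n) if deg[i] == 0)
    topo = []
    while q:
        u = q.popleft()
        topo.append(u)
        for v in succ[u]:
            deg[v] -= 1
            if deg[v] == 0:
                q.append(v)

    # depth[v] = number of courses on the longest chain starting at v.
    depth = [1] * n
    for u in reversed(topo):
        for v in succ[u]:
            depth[u] = max(depth[u], depth[v] + 1)

    # Each semester, take the k available courses with the deepest chains.
    deg = indeg[:]
    available = [i for i in range(n) if deg[i] == 0]
    semesters = []
    while available:
        available.sort(key=lambda v: -depth[v])
        take, available = available[:k], available[k:]
        semesters.append(take)
        for u in take:
            for v in succ[u]:
                deg[v] -= 1
                if deg[v] == 0:
                    available.append(v)
    return semesters
```

On the four-vertex example above (vertices 0..3, edge 0 to 1), this picks vertex 0 first because its chain is deepest, yielding a two-semester schedule.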

Inspired by taking school courses efficiently.

## Does anyone know how to make the extra data about the number of jobs on a page show in Google search results?

I have noticed that when I google for jobs, for example ‘plumber jobs in Melbourne’, some results have a piece of data, ‘407 jobs’, prepended before the normal meta description is shown.

Does anyone know what Seek has done to get this data shown in Google search results?

## Should backup jobs be partitioned by data type and applicable regulations?

Looking at how Microsoft categorizes data, it can be broken into a few different categories:

• Customer Data
• Customer Content
• Personal Data

Should backup jobs be configured in a way that supports special handling needs, such as

• retention
• deletion (partial or full, for the right to be forgotten)
• access control (catalog, or metadata)

## Why weren’t cron jobs set correctly by my Ansible playbook?

I created this playbook to set crontab:

```yaml
- name: Set PATH to crontab
  cron:
    name: PATH
    env: yes
    user: barman
    job: /usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/var/lib/barman/.local/bin:/var/lib/barman/bin:/usr/pgsql-10/bin/

- name: Automatically run backup for App1
  cron:
    name: "Run Backup for App1"
    minute: "0"
    hour: "3"
    user: barman
    job: "barman cron && barman backup app1"

- name: Automatically run backup for App2
  cron:
    name: "Run Backup for App2"
    minute: "0"
    hour: "4"
    user: barman
    job: "barman cron && barman backup app2"
```

But I only found this in the /etc/cron.d/barman file:

```
# m h  dom mon dow   user     command
* *    *   *   *     barman   [ -x /usr/bin/barman ] && /usr/bin/barman -q cron
```

It seems the tasks weren’t set correctly.
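One thing worth checking when debugging this: Ansible’s `cron` module edits the per-user crontab by default, and only writes under /etc/cron.d when its `cron_file:` parameter is set, so the playbook’s entries and the file above live in different places. A quick look at both (assuming the `barman` user exists; errors are suppressed so this runs anywhere):

```shell
# Ansible's `cron` module targets the per-user crontab by default, so the
# playbook's entries would land here:
crontab -l -u barman 2>/dev/null || echo "per-user crontab not readable"
# ...while /etc/cron.d/barman is a separate file (typically shipped by the
# barman package), untouched by the playbook above:
cat /etc/cron.d/barman 2>/dev/null || echo "/etc/cron.d/barman not present"
```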

## [Politics] Open Question: Are Americans really okay with robots taking jobs?

There are several industries relying on AI and everyone seems to be okay with it. If you think that people affected will be getting some kind of check any time soon or ever to make up for this, you’re crazy. Why isn’t anyone bothered by this?  Some of you are pitiful. Hopefully you’ll remember this when you’re in your cardboard box and listening to the violent growls of your empty stomachs.

## WP Cron job loops infinitely

I am writing a script to add a named Cron job that updates a single user and runs every 5 minutes or so.

My problem is that the job runs for every user over and over again every second or so. Here is the code that I have placed inside my functions.php file.

This is my first foray into the WP Cron functionality with WordPress and would like to know if I set up the jobs correctly.

```php
function so_custom_cron_schedule( $schedules ) {
    $schedules['every_5_minutes'] = array(
        'interval' => 300,
        'display'  => __( 'Every 5 minutes' ),
    );
    return $schedules;
}
add_filter( 'cron_schedules', 'so_custom_cron_schedule' );

function update_social_user( $user_id ) {
    $user = get_userdata( $user_id );
    if ( ! $user ) {
        return;
    }
    var_error_log( 'running for ' . $user_id );
}

function assign_cron() {
    $users = get_users( [ 'role__in' => [ 'administrator', 'seller' ] ] );
    $args  = array( false );
    foreach ( $users as $user ) {
        $hook_name = 'update_fb_' . $user->ID;
        add_action( $hook_name, 'update_social_user' );
        if ( ! wp_next_scheduled( $hook_name, $args ) ) {
            wp_schedule_event( time(), 'every_5_minutes', $hook_name, array( $user->ID ) );
        } else {
            var_error_log( 'Already set' );
        }
    }
}

assign_cron();
```

## Question about jobs regarding Datacenter Security / Cloud Security

first post here!

I had a question about pursuing a career related to Datacenter Security (an Iron Mountain-type company or a data-integrity-type company) or Cloud Security where I am not tied to a desk. I have a huge issue with being in an environment where I cannot be constantly on the move. I was looking into DLP or anything related to surveillance, but can’t seem to pinpoint the exact job. Does anyone have any idea of the job that would be the best for me?

TL;DR

Looking for a job prospect where my skills in network security/infosec can be translated into installing and securing physical and logical systems, without being tied to a desk.

## Why aren’t distributed computing and/or GPUs considered nondeterministic Turing machines if they can run multiple jobs at once?

So we know a nondeterministic Turing machine (NTM) is just a theoretical model of computation. They are used in thought experiments to examine the abilities and limitations of computers, commonly to discuss P vs. NP and the fact that NP-complete problems are not known to be solvable in polynomial time unless the computation is done on a hypothetical NTM. We also know an NTM would use a set of rules prescribing more than one action to be performed for any given situation. In other words, it attempts many different options simultaneously.

Isn’t this what distributed computing does across commodity hardware: run many different possible calculations in parallel? And a GPU does this within a single machine. Why isn’t this considered an NTM?
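One way to make the gap concrete (a back-of-the-envelope sketch with made-up numbers): an NTM can cover all $$2^n$$ branches of an $$n$$-bit search in $$n$$ steps, whereas $$p$$ real processors only divide the total work by the constant factor $$p$$:

```python
def parallel_steps(n_bits: int, processors: int) -> float:
    """Wall-clock steps for p processors to brute-force 2**n_bits candidates.
    Dividing by a fixed p leaves the count exponential in n_bits, unlike an
    NTM, which covers all branches in n_bits steps."""
    candidates = 2 ** n_bits      # branches an NTM explores "at once"
    return candidates / processors

# Doubling n_bits squares the work; doubling processors only halves it.
```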

## Job assignment where each worker handles two non-consecutive jobs

There are $$N$$ workers and $$2N$$ jobs, named from $$J_1$$ to $$J_{2N}$$. There’s a matrix $$M$$ denoting the subset of jobs that can be handled by each worker: if $$M_{i, j}$$ is true, then worker $$i$$ can do job $$j$$.

Our task is to assign exactly 2 jobs to each worker, such that each job is handled by exactly one worker, with respect to $$M$$. (So far the problem can be solved with max flow.) Moreover, if a worker $$i$$ handles job $$j$$, it can’t handle job $$j+1$$. Two questions:

1. Is there a valid assignment at all?
2. If there is, find a solution to $$\max_{\text{assignment}} \min_{i} \left| J1_i - J2_i \right|$$, where $$J1_i$$ is the first job assigned to worker $$i$$ and $$J2_i$$ is the second job assigned to worker $$i$$. In other words, maximize the minimum interval between the two jobs, over all workers.
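For tiny instances, question 1 can be checked directly by brute force. The sketch below (exponential time, for experimentation only; it uses 0-indexed jobs $$0,\dots,2N-1$$ in place of $$J_1,\dots,J_{2N}$$) enforces both the matrix $$M$$ and the non-consecutive constraint; the max-flow model mentioned above handles only the version without that extra constraint:

```python
from itertools import combinations

def feasible(M, N):
    """Brute-force check that each worker can be given exactly two allowed,
    non-adjacent jobs, with all 2N jobs covered exactly once."""
    def backtrack(worker, remaining):
        if worker == N:
            return not remaining          # every job must be used
        for j1, j2 in combinations(sorted(remaining), 2):
            if j2 - j1 == 1:              # consecutive jobs are forbidden
                continue
            if M[worker][j1] and M[worker][j2]:
                if backtrack(worker + 1, remaining - {j1, j2}):
                    return True
        return False
    return backtrack(0, set(range(2 * N)))
```

For example, with $$N=2$$ and every worker able to do every job, the assignment $$\{0,2\},\{1,3\}$$ exists; but if worker 0 can only do jobs 0 and 1 (which are adjacent), no valid assignment exists.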