Out of 1000 time slices per second, the OS itself uses 100 (assume these are spread at regular intervals over each second), so it can make 900 time slices available to virtual machines each second. The computer needs to run 10 programs that each require a time slice at least 20 times per second, plus one program (the user interface) that requires a time slice at least once every 10 milliseconds.
Does this mean that:
OS = 100 time slices
10 programs = 200 time slices (10 × 20 per second)
UI = 100 time slices (one every 10 ms = 100 per second)
Resulting in 600 time slices left over for other programs to run?
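For what it's worth, here is a quick sketch of my arithmetic (the variable names are just mine, not part of the problem statement):

```python
# Assumed values taken from the question above.
TOTAL_SLICES_PER_SECOND = 1000
OS_SLICES = 100                    # reserved by the OS itself

NUM_PROGRAMS = 10
SLICES_PER_PROGRAM = 20            # each program needs >= 20 slices per second
program_slices = NUM_PROGRAMS * SLICES_PER_PROGRAM   # 200

UI_PERIOD_MS = 10                  # UI needs a slice at least every 10 ms
ui_slices = 1000 // UI_PERIOD_MS   # 100 slices per second

remaining = TOTAL_SLICES_PER_SECOND - OS_SLICES - program_slices - ui_slices
print(remaining)                   # 600
```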