Calculating tips

I am enrolled in an online JavaScript course and one of our coding challenges was presented as follows:

John and his family went on holiday and visited 3 different restaurants. The bills were $124, $48, and $268.

To tip the waiter a fair amount, John created a simple tip calculator (as a function). He likes to tip 20% of the bill when the bill is less than $50, 15% when the bill is between $50 and $200, and 10% if the bill is more than $200.

In the end, John would like to have 2 arrays:
1) Containing all three tips (one for each bill)
2) Containing all three final paid amounts (bill + tip)
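Working the rules through by hand: $124 falls in the 15% bracket ($18.60 tip, $142.60 total), $48 in the 20% bracket ($9.60 tip, $57.60 total), and $268 in the 10% bracket ($26.80 tip, $294.80 total).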

I have come up with the following solution:

var bills = [124, 48, 268];
var totals = [];
var pointlessArray = [];

function calculateTip(cost) {
    switch (true) {
        case cost < 50:
            return cost * .2;
        case cost >= 50 && cost <= 200:
            return cost * .15;
        case cost > 200:
            return cost * .1;
        default:
            throw new Error('Unsupported input.');
    }
}

function makePointlessArray(inputArray) {
    var length = inputArray.length;
    for (var i = 0; i < length; i++) {
        pointlessArray[i] = calculateTip(inputArray[i]);
    }
}

function calculateTotal(billArray) {
    var length = billArray.length;
    for (var i = 0; i < length; i++) {
        totals[i] = billArray[i] + calculateTip(billArray[i]);
    }
}

makePointlessArray(bills);
calculateTotal(bills);

console.log(`The bills are: ${bills}`);
console.log(`The calculated tips are: ${pointlessArray}`);
console.log(`The calculated totals are: ${totals}`);
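For reference, running this prints:

The bills are: 124,48,268
The calculated tips are: 18.6,9.6,26.8
The calculated totals are: 142.6,57.6,294.8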

I don't think this is practical at all for calculating tips, but I have tried to stay within the parameters of the challenge.

I am unsure whether declaring the arrays as global variables is best practice or if some other method should be used, but as a JS newbie I would appreciate any input on pitfalls in my code.
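One alternative I have been toying with, to avoid the globals entirely, is to have each function build and return a new array (the names calculateTips and calculateTotals below are just my own placeholders):

function calculateTip(cost) {
    // Same brackets as above: 20% under $50, 15% from $50 to $200, 10% above $200.
    if (cost < 50) return cost * 0.2;
    if (cost <= 200) return cost * 0.15;
    return cost * 0.1;
}

function calculateTips(billArray) {
    // map returns a fresh array, so no outer state is touched.
    return billArray.map(calculateTip);
}

function calculateTotals(billArray) {
    return billArray.map(function (bill) {
        return bill + calculateTip(bill);
    });
}

var bills = [124, 48, 268];
console.log(`The calculated tips are: ${calculateTips(bills)}`);
console.log(`The calculated totals are: ${calculateTotals(bills)}`);

I am not sure whether this is actually better or just different, so comments on that approach would be welcome too.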