Is it normal to be running out of spells?

I’ve just started playing D&D with a group; this is the first campaign for everyone in the group. I’m playing an elf Wizard and I feel like I’m struggling with attacks, and I’m not sure if I’m missing something. I’ve essentially got two means of attack:

  1. Shortsword – but I need to be in close, and I have so little HP that I use it very infrequently.
  2. Spells – I’ve been using some spells, but I find that I quickly run out of spell slots. I’ve just leveled up to level 2, which helps, but I’m quite quickly down to cantrips (of which I believe I’ve got one ranged attack – Ray of Frost).

So I quickly find myself in a predicament: I can just keep Ray of Frosting monsters, or risk dying in close combat. Is this pretty typical, or have I missed something?

I should add that I’m hoping to buy a longbow at some point (but I couldn’t afford one last time we were at a shop). In the current campaign we’ve had three encounters so far, and the next session continues straight on from this one, so I won’t have a chance for a long rest.

Finding the worst case running time of this piece of code?

I am working with this code:

    function strange(list a[0..n-1] of integers such that abs(a[i]) ≤ n for every 0 ≤ i ≤ n-1,
                     list b[0..2n] of zeroes)
        for i ← 0 to n - 1 do
            a[i] ← a[i] + n
        for i ← 0 to n - 1 do
            for j ← 0 to abs(a[i] - 1) do
                b[j] ← b[j] + 1
        return b

I am trying to figure out the worst-case running time for the code above. So far I’m guessing that the first for loop will run n times, but I’m not sure how to prove this. For the second and third for loops, I’m unsure how to approach the analysis. If possible, could someone help me solve this?
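To make the counting concrete, here is a direct Python translation I put together (the names, and the assumption that “0 to x” is inclusive, are mine); it also counts how many times the innermost line runs:

    def strange(a):
        # Direct translation of the pseudocode; assumes abs(a[i]) <= n for all i.
        n = len(a)
        b = [0] * (2 * n + 1)                    # b[0..2n], all zeroes
        for i in range(n):                       # first loop: exactly n iterations
            a[i] = a[i] + n                      # afterwards 0 <= a[i] <= 2n
        steps = 0                                # counts innermost-line executions
        for i in range(n):
            for j in range(abs(a[i] - 1) + 1):   # "0 to abs(a[i] - 1)", read inclusively
                b[j] += 1
                steps += 1
        return b, steps

    # If every a[i] starts at n, the inner loop runs about 2n times per outer
    # iteration, so steps should grow roughly like 2n^2:
    for n in (10, 20, 40):
        print(n, strange([n] * n)[1])            # 200, 800, 3200 -> quadruples as n doubles

From those numbers my guess is that the first loop contributes n steps and the nested loops something up to about 2n^2 in the worst case, but I don’t know how to argue it rigorously.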

Running PHP echo $_SERVER['DOCUMENT_ROOT']; Shows Apache Default Path

I am trying to get set up and running with a new hosting company, after the old one announced they are discontinuing their service at the end of the year, and I am having difficulty getting the sites to run. I narrowed it down to Apache’s DocumentRoot for each domain showing the Apache default path rather than the path to the individual site’s file location. In other words, when I run echo $_SERVER['DOCUMENT_ROOT']; in a test script, it shows the path as /etc/apache2/htdocs when it should show /home/username/public_html/domain.com. The hosting company seems unable to fix it, so can DocumentRoot be changed through cPanel for each domain?
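For reference, my understanding is that each domain’s virtual host should carry its own DocumentRoot, roughly along these lines (the paths here are just examples matching my layout, not anything the host actually generated):

    <VirtualHost *:80>
        ServerName domain.com
        ServerAlias www.domain.com
        # What I would expect DOCUMENT_ROOT to report, instead of /etc/apache2/htdocs
        DocumentRoot /home/username/public_html/domain.com
    </VirtualHost>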

Difficulty understanding the use of arbitrary function for the worst case running time of an algorithm

In CLRS the author said

"Technically, it is an abuse to say that the running time of insertion sort is $ O(n^2)$ , since for a given $ n$ , the actual running time varies, depending on the particular input of size $ n$ . When we say “the running time is $ O(n^2)$ ,” we mean that there is a function $ f(n)$ that is $ O(n^2)$ such that for any value of $ n$ , no matter what particular input of size $ n$ is chosen, the running time on that input is bounded from above by the value $ f(n)$ . Equivalently, we mean that the worst-case running time is $ O(n^2)$ . "

What I have difficulty understanding is why the author talked about an arbitrary function $f(n)$ instead of directly about $n^2$.

I mean, why didn’t the author write

"When we say “the running time is $ O(n^2)$ ,” we mean that for any value of $ n$ , no matter what particular input of size $ n$ is chosen, the running time on that input is bounded from above by the value $ cn^2$ for some +ve $ c$ and sufficiently large n. Equivalently, we mean that the worst-case running time is $ O(n^2)$ ".

I have very limited understanding of this subject so please forgive me if my question is too basic.

How to prove that there is no algorithm with worst-case running time better than this one?

I have the following data:

  • A set $V$ of tasks, the starting time $s_j$ of each task, and the duration $p_j$ of each task.

  • A set $K$ of resources; each resource $k$ has an availability function $R_k$ that is piecewise constant. That is, for each $t = 0, \ldots, T-1$, we specify $R_k(t)$, the number of units available at time $t$. $R_k$ is an array of length $T$.

  • Each task $j$ needs $r_{j,k}$ units of resource $k$ to be processed (possibly zero). This quantity needs to be available during the whole processing time, starting from $s_j$.

Here is my attempt to verify that the resource utilization at each $t$ is no larger than the availability function.

Algorithm
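To fix notation, a naive version of the check would look roughly like the sketch below (the names are invented; R[k][t] is the availability of resource k at time t, and each task is given as (s, p, r) with r[k] its demand on resource k):

    def within_availability(tasks, R, T):
        # True iff, at every time t, the total demand on each resource k
        # from tasks active at t is at most R[k][t].
        K = len(R)
        usage = [[0] * T for _ in range(K)]      # usage[k][t]
        for (s, p, r) in tasks:
            for k in range(K):
                if r[k] == 0:
                    continue
                for t in range(s, s + p):        # the task holds r[k] units on [s, s+p)
                    usage[k][t] += r[k]
        return all(usage[k][t] <= R[k][t]
                   for k in range(K) for t in range(T))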

SQL Server always running

I have a personal computer where I have different instances of SQL Server running (Developer Edition), with both Integration Services and Analysis Services (one in tabular mode and one in multidimensional). I use it for practice and to improve my skills. The start mode in Configuration Manager is "Automatic". So I have two instances, each with both SSIS and SSAS. When I am not using SQL Server, will these services use a lot of resources on my computer simply by running in the background?
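If it helps, I could check how much memory an idle database engine instance is actually holding with a query like this (I believe this DMV is available in Developer Edition):

    SELECT physical_memory_in_use_kb,
           memory_utilization_percentage
    FROM sys.dm_os_process_memory;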

Thanks

Running a RAM on a given input

I understand how RAM commands work, but I am unable to understand how we use a given input string and find the output. For instance,

there’s a Random Access Machine whose input is a string over {0,1}*. The program logic is as follows:

    1: read
    2: store 1
    3: read
    4: add 1
    5: read
    6: add 1
    7: load 1
    8: if a=2 go to 11
    9: print 0
    10: goto 12
    11: print 1
    12: end

Now, on the input tape we have i=11101011. How can I find the content of the output tape? What’s the approach?

When we see read, do we only read the first character? If yes, then what exactly do we add 1 to? Is the output also supposed to be in binary?
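To show where I get stuck, here is how I would simulate it in Python if my guess at the semantics is right (read loads the next input symbol into the accumulator a; store 1 / load 1 copy a into / out of register r1; and add 1 means a ← a + r1, which is exactly the convention I’m unsure about):

    tape = iter("11101011")          # the input tape, one symbol consumed per read
    out = []                         # the output tape

    a = int(next(tape))              # 1: read
    r1 = a                           # 2: store 1
    a = int(next(tape))              # 3: read
    a = a + r1                       # 4: add 1
    a = int(next(tape))              # 5: read
    a = a + r1                       # 6: add 1
    a = r1                           # 7: load 1  (note: this discards the sums from
                                     #    lines 4 and 6, which makes me doubt my reading)
    if a == 2:                       # 8: if a=2 go to 11
        out.append("1")              # 11: print 1
    else:
        out.append("0")              # 9: print 0, then 10: goto 12
                                     # 12: end
    print("".join(out))              # contents of the output tape under these assumptions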