What color format and depth buffer should I use in a RenderTexture for WebGL screen output in Unity? Is NPOT ok? MipMaps?

Today I learned about a technique for forcing uniform resolution in a Unity WebGL build, as described here in the answer by DMGregory. Yay! I’m experimenting with implementing this technique on a large ongoing project and am seeing promising results so far. However, I am a little baffled as to what color format I should choose for the RenderTexture. It seems to me it would be most efficient to choose something without an alpha channel, but that’s just a hunch. There are seemingly zillions of formats (not all supported on WebGL), as the RenderTexture Inspector shows:

[Screenshot: partial list of RenderTexture color formats]

With the build-test cycle on WebGL being slow as it is, I’m looking for some insight as to how to choose the most efficient color format. And what about the depth buffer setting?
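In case it's relevant, I was planning to probe format support at runtime with something like the sketch below, rather than rebuilding for each guess. The candidate list is just my own guess at plausible formats, not something taken from the answer I'm following:

    using UnityEngine;
    using UnityEngine.Experimental.Rendering;

    public static class FormatCheck
    {
        // Logs whether a few candidate color formats can be rendered to
        // on the current platform (run this in the WebGL player itself).
        public static void LogCandidates()
        {
            GraphicsFormat[] candidates =
            {
                GraphicsFormat.R8G8B8A8_UNorm,    // 32-bit with alpha
                GraphicsFormat.B8G8R8A8_UNorm,    // 32-bit with alpha, swapped channel order
                GraphicsFormat.R5G6B5_UNormPack16 // 16-bit, no alpha
            };

            foreach (var format in candidates)
            {
                bool renderable = SystemInfo.IsFormatSupported(format, FormatUsage.Render);
                Debug.Log($"{format}: {(renderable ? "renderable" : "not supported")}");
            }
        }
    }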

Finally, does it matter if the RenderTexture is NPOT? Again, my hunch is to keep it to as few pixels as possible, even if that means an NPOT size, but ¯\_(ツ)_/¯. The same goes for MipMaps: they seem like wasted memory, but I'm guessing here.
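For reference, here is roughly how I'm creating the RenderTexture at the moment. The NPOT size, 24-bit depth, and ARGB32 color are placeholders I picked, not values from the linked answer, and the part that actually draws the texture to the screen is omitted since it isn't relevant to the format question:

    using UnityEngine;

    [RequireComponent(typeof(Camera))]
    public class FixedResolutionTarget : MonoBehaviour
    {
        // Placeholder NPOT size; the real project uses whatever the design calls for.
        const int targetWidth  = 900;
        const int targetHeight = 600;

        RenderTexture rt;

        void OnEnable()
        {
            // 24-bit depth and ARGB32 color are common Unity defaults;
            // these are exactly the settings I'm unsure about for WebGL.
            rt = new RenderTexture(targetWidth, targetHeight, 24, RenderTextureFormat.ARGB32);
            rt.useMipMap = false;   // my guess: mipmaps are wasted for a screen-sized target
            rt.filterMode = FilterMode.Bilinear;
            rt.Create();
            GetComponent<Camera>().targetTexture = rt;
        }

        void OnDisable()
        {
            GetComponent<Camera>().targetTexture = null;
            if (rt != null) rt.Release();
        }
    }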

Edit: I found this, which does explain the nomenclature at least, but I’m still unsure how I should be thinking about this.