Course teaching time complexities in real-life systems


Having misread "What course in CS deals with the study of RAM, CPU, Storage?", I now wonder which course in CS deals with time complexities that account for GPUs, multi-level CPU caches, seek times on hard disks vs. SSDs, and bandwidth to disk and RAM.

I was taught big-O notation, but it never took into account that I might have a GPU with hundreds of cores, a limited amount of extremely fast cache, or a hard disk with high bandwidth but high seek time.
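To make the gap concrete, here is a minimal C sketch (the array size and loop structure are my own illustration, not from any particular course): both loops perform exactly the same number of additions, so big-O counts them as identical, yet the column-major traversal is typically several times slower on real hardware because of cache behavior.

```c
#include <stdio.h>
#include <time.h>

#define N 4096

/* Both loops do N*N additions: identical in big-O terms.
   The row-major loop walks memory sequentially (cache-friendly);
   the column-major loop strides N elements per step (cache-hostile). */
int main(void) {
    static int a[N][N];   /* ~64 MiB; static storage avoids stack overflow */
    long long sum = 0;
    clock_t t;

    t = clock();
    for (int i = 0; i < N; i++)          /* row-major: sequential access */
        for (int j = 0; j < N; j++)
            sum += a[i][j];
    printf("row-major:    %.3fs\n", (double)(clock() - t) / CLOCKS_PER_SEC);

    t = clock();
    for (int j = 0; j < N; j++)          /* column-major: strided access */
        for (int i = 0; i < N; i++)
            sum += a[i][j];
    printf("column-major: %.3fs\n", (double)(clock() - t) / CLOCKS_PER_SEC);

    return (int)(sum & 1);   /* use sum so the loops aren't optimized away */
}
```

(The exact ratio depends on the hardware and compiler; compiling with low optimization, e.g. `-O1`, keeps the compiler from vectorizing or eliminating the loops.)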

Which course teaches this extended view of algorithm time complexity, one that takes such real-world limitations into account?