Is there a common unit of measurement for comparing the computing power used to solve mathematical puzzles?

I just want to make sure I'm reasoning correctly. If two computers are solving the same mathematical puzzle, e.g. Bitcoin's SHA-256 proof of work (finding a nonce that satisfies the difficulty target), the computing power each one expends can be compared in units of gigahashes, terahashes, etc., and we know which one used more computing power over a given time.

But what if we wanted to compare the computing power (herein called "CP") used by two computers over time t when they are solving different hash functions (e.g. one SHA-256 and the other SHA3-512), or different mathematical functions altogether? Could we then express the computing power used by Computer 1 and Computer 2 in a common denomination and conclude that, over time t, Computer 1's CP > Computer 2's CP, and by how much? And what would be the proper and best denominator (FLOPS, MIPS, IPS, ...)?
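To make the comparison I have in mind concrete, here is a minimal sketch (my own illustration, assuming only Python's standard hashlib, not any real mining software) that counts how many SHA-256 and SHA3-512 hashes the same machine finishes in a fixed time window. Each count is a rate for a *different* puzzle, so they don't obviously reduce to one "computing power" number, which is exactly what I'm asking how to reconcile.

```python
# Illustrative sketch only: measure hashes/second for two different hash functions
# on the same machine. The two rates are in different "units of work".
import hashlib
import os
import time

def hashes_per_second(hash_name: str, duration: float = 1.0) -> float:
    """Count how many `hash_name` digests of an 80-byte input complete in `duration` seconds."""
    data = os.urandom(80)  # roughly the size of a Bitcoin block header
    count = 0
    deadline = time.perf_counter() + duration
    while time.perf_counter() < deadline:
        hashlib.new(hash_name, data).digest()
        count += 1
    return count / duration

if __name__ == "__main__":
    print("SHA-256 :", hashes_per_second("sha256"), "H/s")
    print("SHA3-512:", hashes_per_second("sha3_512"), "H/s")
```

Comparing the two printed rates directly feels wrong, since one SHA3-512 evaluation is not the same amount of "work" as one SHA-256 evaluation, which is why I'm asking whether FLOPS, MIPS, or some other unit is the right common denominator.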