Consider these equivalent functions in C and Python 3. Most devs would immediately claim both are O(1).

`def is_equal(a: int, b: int) -> bool: return a == b`

`int is_equal(int a, int b) { return a == b; }`

But consider what is happening under the surface. An integer is just a string of bits, and to determine equality both languages compare those bits. That scan is O(b), where b is the number of bits. Since integers in C have a fixed size in bits, the scan is bounded by a constant and is simply O(1).

In Python 3, however, integers do *not* have a fixed size, so the scan remains O(b) in the number of bits of the input, or equivalently O(log a) where a is the value of the input (the base of the logarithm only changes the constant factor).
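To make that concrete, here is a small sketch showing that a Python int's width grows with its value, which is exactly why any scan over its bits cannot be constant-time. Note that `bit_length()` is roughly log₂ of the value, which is the O(log a) in the claim above.

```python
# Python ints are arbitrary-precision: bit_length() grows with the value,
# so an equality scan over the bits has more work to do on bigger inputs.
for exp in (1, 10, 100, 1_000, 10_000):
    n = 10 ** exp
    # bit_length() is approximately exp * log2(10), i.e. O(log n)
    print(f"10^{exp}: {n.bit_length()} bits")
```

Running this shows the bit count scaling linearly with the number of decimal digits: 10^100 already takes 333 bits, and 10^10000 takes 33,220.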

So if you’re analyzing code in Python, any time you compare two integers you are embarking on a surprisingly complex journey: O(log n) with respect to the value of either number.
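The cost is easy to observe directly. The sketch below (variable names are mine) times equality checks on small versus very large ints; if I'm reading CPython's `longobject.c` correctly, a comparison first checks the operands' internal digit counts, so the full digit-by-digit scan only happens when both numbers are the same length.

```python
import timeit

# Two equal but distinct huge ints: comparing them forces a full scan
# of all ~332k bits. Comparing a huge int against a small one can bail
# out early on the length check (a CPython implementation detail).
big = 10 ** 100_000
big_copy = 10 ** 100_000   # equal value, separate object
small = 7

t_mismatch = timeit.timeit(lambda: big == small, number=100_000)
t_full_scan = timeit.timeit(lambda: big == big_copy, number=100_000)
print(f"different lengths: {t_mismatch:.4f}s, full scan: {t_full_scan:.4f}s")
```

On my understanding, the full-scan case should be dramatically slower, which is the O(log n) cost showing up in practice.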

For me this raises several questions:

- Is this correct? I haven’t seen anyone else claim that Python compares ints in log time.
- In the context of conducting an interview, should you notice or care if a candidate calls this O(1)?
- Should you notice or care about this distinction in the real world?