Robots.txt directive confusion

I refer to this webpage: https://developers.google.com/search/docs/advanced/robots/robots_txt

Could someone elaborate on the difference between these two:

http://example.com/page.htm

allow: /page
disallow: /*.htm

Applicable rule: disallow: /*.htm, because it matches more characters in the URL, so it’s more specific.

http://example.com/page.php5

allow: /page
disallow: /*.ph

Applicable rule: allow: /page, because in the case of conflicting rules, Google uses the least restrictive rule.

The two rule sets look almost identical, yet one URL ends up disallowed while the other is allowed. Why?
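As I understand the page, specificity is measured by the character length of the rule's path, and a tie between an allow and a disallow rule is broken in favor of allow. Here is a minimal sketch of that precedence logic (the helper names are my own, not Google's, and '*'/'$' wildcard handling is simplified):

```python
import re

def rule_matches(pattern: str, path: str) -> bool:
    # Translate a robots.txt path pattern into a regex:
    # '*' matches any run of characters, a trailing '$' anchors
    # the end of the URL, everything else is literal.
    regex = '^' + ''.join(
        '.*' if ch == '*'
        else '$' if ch == '$' and i == len(pattern) - 1
        else re.escape(ch)
        for i, ch in enumerate(pattern)
    )
    return re.match(regex, path) is not None

def is_allowed(path: str, allows: list[str], disallows: list[str]) -> bool:
    # Most specific (longest) matching rule wins; on a tie between
    # an allow and a disallow rule, the allow rule wins.
    best_allow = max((len(p) for p in allows if rule_matches(p, path)), default=-1)
    best_disallow = max((len(p) for p in disallows if rule_matches(p, path)), default=-1)
    return best_allow >= best_disallow

# Example 1: disallow '/*.htm' is 6 characters, allow '/page' is 5,
# so the disallow rule is more specific and wins.
print(is_allowed('/page.htm', ['/page'], ['/*.htm']))   # False

# Example 2: '/page' and '/*.ph' are both 5 characters -> a tie,
# so the less restrictive allow rule wins.
print(is_allowed('/page.php5', ['/page'], ['/*.ph']))   # True
```

If this reading is right, the only real difference between the two cases is one character of rule length: /*.htm (6) outweighs /page (5), while /*.ph (5) merely ties with it.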