Does the order of "Disallow" and "Sitemap" lines in robots.txt matter?

One can order robots.txt this way:

    User-agent: DESIRED_INPUT
    Sitemap: https://example.com/sitemap-index.xml
    Disallow: /

instead of:

    User-agent: DESIRED_INPUT
    Disallow: /
    Sitemap: https://example.com/sitemap-index.xml

I assume both are fine, because presumably all crawlers parse the entire file before acting on any of it.
Still, is it best practice to put Disallow: before Sitemap:, to guard against the extremely unlikely bug of a crawler processing the file sequentially and starting to crawl before it has seen the Disallow: rules?
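As a quick sanity check, here is a minimal sketch using Python's standard-library urllib.robotparser to confirm that both orderings are interpreted the same way. It uses "*" in place of the DESIRED_INPUT placeholder, and example.com is of course a stand-in domain; real crawlers may have their own parsers, so this only demonstrates one well-known implementation:

```python
from urllib.robotparser import RobotFileParser

# Two robots.txt variants that differ only in the order of the
# Sitemap: and Disallow: lines (example.com is a placeholder).
variant_a = """User-agent: *
Sitemap: https://example.com/sitemap-index.xml
Disallow: /
"""

variant_b = """User-agent: *
Disallow: /
Sitemap: https://example.com/sitemap-index.xml
"""

def allowed(robots_txt: str, url: str) -> bool:
    """Return whether the '*' user agent may fetch url under robots_txt."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch("*", url)

# Both orderings disallow everything for the '*' user agent.
print(allowed(variant_a, "https://example.com/page"))  # → False
print(allowed(variant_b, "https://example.com/page"))  # → False

# The sitemap URL is picked up either way (site_maps() needs Python 3.8+).
parser = RobotFileParser()
parser.parse(variant_a.splitlines())
print(parser.site_maps())  # → ['https://example.com/sitemap-index.xml']
```

This matches the sitemaps.org protocol, which states that the Sitemap directive is independent of any user-agent group and can appear anywhere in the file.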