Is Allow: in robots.txt an exclusion to an exclusion?

Say I have a MediaWiki website where every page in the Special: namespace is excluded via a robots.txt Disallow: Special: rule, but there are a few specific Special: pages that I do want crawled (combined example below):

  • Allow: Special:RecentChanges
  • Allow: Special:RandomPage
  • Allow: Special:Categories

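Put together, a minimal robots.txt sketch of this pattern might look like the following, assuming the common /wiki/ article path (adjust the prefix to your wiki's actual URL layout):

    User-agent: *
    # Block everything in the Special: namespace ...
    Disallow: /wiki/Special:
    # ... except these pages: under RFC 9309 (and Google's
    # documented behavior) the most specific, i.e. longest,
    # matching rule wins, so these Allow lines override the
    # shorter Disallow for exactly these URLs
    Allow: /wiki/Special:RecentChanges
    Allow: /wiki/Special:RandomPage
    Allow: /wiki/Special:Categories
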
Is Allow: in robots.txt an exclusion to an exclusion?

To ask a more specific, two-part question: is the code above what I need to add to robots.txt, and is it correct to say that these Allow rules are "exclusions to the (general) exclusion"?