Prevent search engines from crawling and indexing all domains and subdomains hosted on an Apache server

My company rents a CentOS server with one main domain and a varying number of subdomains (usually around 30) hosted on it. The subdomains are used for various purposes, mostly development, and are usually deleted once they outlive their usefulness.

Recently it has come to our attention that search engines have indexed some of our subdomains, both defunct and active ones.

What would be the best way of preventing search engines and crawlers from crawling and indexing the main site and its subdomains?

I am aware that it's possible to block crawlers with a robots.txt file (using Disallow directives; noindex, nofollow and the like go in meta tags or headers rather than robots.txt), but I'd like to avoid that if possible: going through ~30 subdomains, setting up a new robots.txt whenever a subdomain is created, and so on.
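For reference, the blanket robots.txt I'm trying to avoid maintaining on every subdomain would just be:

    # Tell all crawlers to stay away from the entire site
    User-agent: *
    Disallow: /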

I’m also aware that it’s possible to use:

    <IfModule mod_headers.c>
        Header set X-Robots-Tag "noindex, noarchive, nosnippet, nofollow, noodp, noydir"
    </IfModule>

in .htaccess. What I'm unsure of is whether I could place this directive in the main domain's .htaccess in the hope of preventing crawling of both the main domain and its subdomains.
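Alternatively, could I set the header once at the server level instead of per domain? This is a sketch of what I have in mind, placed in the global httpd.conf context (untested; it assumes mod_headers is loaded and that the virtual hosts inherit the directive unless they override it):

    # Sketch: global httpd.conf context, outside any <VirtualHost>,
    # hoping the header applies to the main domain and all subdomains
    <IfModule mod_headers.c>
        Header set X-Robots-Tag "noindex, nofollow"
    </IfModule>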

Is there a better way of doing this? I've come across a StackOverflow answer suggesting that Alias can be used in httpd.conf to serve a single robots.txt for every virtual host. Could the same approach be used for .htaccess as well?
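As far as I understand it, the Alias approach would look something like this (the /var/www/shared path is hypothetical, and Require all granted is Apache 2.4 syntax):

    # Sketch: map /robots.txt on every virtual host to one shared file
    Alias /robots.txt /var/www/shared/robots.txt

    # Grant access to the shared directory (Apache 2.4)
    <Directory "/var/www/shared">
        Require all granted
    </Directory>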