Drupal 7 & robots.txt: Disallow all subdomains or prevent a subdomain from being indexed on Google

When you are running multiple Drupal sites from a single code base (multisite) and you need a different robots.txt file for each one, use the RobotsTxt module.

This module generates the robots.txt file dynamically and lets you edit it, on a per-site basis, from the web UI.
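For example, to keep an entire subdomain (say, a staging site) out of Google and other crawlers, you could paste the following into that site's RobotsTxt settings form (in Drupal 7 the module's configuration page is typically found under admin/config/search — the exact path may vary with your setup). This is a standard Robots Exclusion Protocol snippet, not something specific to the module:

```
# Block every crawler from every path on this site.
User-agent: *
Disallow: /
```

Because the module serves a separate robots.txt per site, your other sites in the multisite install keep their own rules and remain indexable.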

Note: You must delete or rename the robots.txt file in the root of your Drupal installation for this module to display its own robots.txt file(s).
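Renaming (rather than deleting) the shipped file keeps a backup you can restore later. A minimal sketch of the rename step, demonstrated in a scratch directory — in practice you would run the `mv` from your actual Drupal docroot:

```shell
# Demo in a throwaway directory; substitute your real Drupal root in practice.
mkdir -p /tmp/drupal-demo && cd /tmp/drupal-demo
touch robots.txt                  # stands in for the robots.txt Drupal ships with

# If the static file exists, the web server (or Drupal) serves it directly and
# the module's dynamically generated robots.txt is never reached.
mv robots.txt robots.txt.bak

ls robots.txt.bak
```

After the rename, a request to /robots.txt on each site is answered by the RobotsTxt module instead of the static file.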

If you are running into an issue like "Cannot apply a different robots.txt for each multi-site", this solution will probably work well for you.