Are we allowed to override robots.txt in a dev environment?

Quick answer: not by default. Overriding is available only for a limited time on newly rolled-out development servers.

Why is search-engine crawling disabled via robots.txt on dev servers?
Dev servers are meant for staging website projects during development. If search engines were allowed to crawl the large number of unoptimized sites they host, the servers' capacity would be exceeded.
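
The exact file the dev platform serves isn't shown in this FAQ, but a blanket crawler block of the kind described above typically comes down to two lines of robots.txt:

```
User-agent: *
Disallow: /
```

`User-agent: *` applies the rule to every crawler, and `Disallow: /` covers the whole site. Note that robots.txt is advisory: well-behaved search engines honor it, but it is not an access control.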

For a limited time, overriding is available on new dev servers because current capacity allows it. As soon as that capacity limit is reached, however, search-engine crawling will be disabled on the dev environment again.

What choice do I have if I want search engines to index my site(s)?
For sites that need to be crawled and indexed by search engines, use production hosting instead.
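
Once your site is on production hosting, a quick way to confirm that its robots.txt actually permits crawling is Python's standard-library robotparser. This is a minimal sketch; www.example.com is a placeholder for your own domain:

```python
import urllib.robotparser

# Placeholder domain -- replace with your own production site.
ROBOTS_URL = "https://www.example.com/robots.txt"

rp = urllib.robotparser.RobotFileParser()
rp.set_url(ROBOTS_URL)
rp.read()  # fetch and parse the live robots.txt

# can_fetch() reports whether the named crawler may request the URL.
for agent in ("Googlebot", "Bingbot", "*"):
    allowed = rp.can_fetch(agent, "https://www.example.com/")
    print(f"{agent}: {'allowed' if allowed else 'blocked'}")
```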

Shared production hosting starts at $9.95/month (the same price as the cheapest development plan), and VPS hosting starts at $19.95/month. See the pricing page for details.

I still have questions...
Feel free to get in touch with the support team by visiting the contact page.