
Automatically add robots.txt #15

@gkpty

Description


When deploying a dev/test bucket, add a robots.txt that tells search engines not to crawl anything:

User-agent: *
Disallow: /

When deploying a production bucket, change robots.txt to:

User-agent: *
Allow: /

Only add a robots.txt file if one doesn't already exist.

Add a step in the upload method that allows/disallows user agents depending on whether the deployment is prod or test, as sketched below.
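A minimal sketch of what that step could look like, assuming the site is deployed to an S3 bucket via boto3. The names `upload_robots_txt`, `bucket`, and `is_production` are hypothetical placeholders, not existing identifiers in this project:

```python
# Sketch only: assumes an S3 bucket and boto3; function/parameter names are hypothetical.
import boto3
from botocore.exceptions import ClientError

ROBOTS_ALLOW = "User-agent: *\nAllow: /\n"      # production: let crawlers in
ROBOTS_DISALLOW = "User-agent: *\nDisallow: /\n"  # dev/test: block all crawlers

def upload_robots_txt(bucket: str, is_production: bool) -> None:
    """Add a robots.txt to the bucket unless one already exists."""
    s3 = boto3.client("s3")

    # Only add robots.txt if the site doesn't already ship its own.
    try:
        s3.head_object(Bucket=bucket, Key="robots.txt")
        return  # a robots.txt is already there; leave it untouched
    except ClientError as err:
        if err.response["Error"]["Code"] not in ("404", "NoSuchKey"):
            raise  # unexpected error; don't silently swallow it

    # Allow crawlers in production, disallow them on dev/test deployments.
    body = ROBOTS_ALLOW if is_production else ROBOTS_DISALLOW
    s3.put_object(
        Bucket=bucket,
        Key="robots.txt",
        Body=body.encode("utf-8"),
        ContentType="text/plain",
    )
```

This would be called from the upload method after the site files are synced, with `is_production` derived from whichever flag or config value the tool already uses to distinguish prod from test deployments.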
