Robots.txt Traefik plugin

Table of Contents

  1. Description
  2. Setup
  3. Usage
  4. Reference
  5. Development
  6. Contributors

Description

Robots.txt is a middleware plugin for Traefik that adds rules to your website's /robots.txt, based on the ai.robots.txt list or on custom rules you define.

Setup

# Static configuration

experimental:
  plugins:
    robots-txt:
      moduleName: github.com/aon4o/traefik-plugin-robots-txt
      version: {{latest tagged version}}
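If you configure Traefik through CLI flags rather than a YAML file, the same plugin declaration can be passed as arguments. This is a sketch of the equivalent flags; the version tag shown is a placeholder, not an actual release:

```sh
# Static configuration via CLI flags, equivalent to the YAML above.
# Replace v1.0.0 with the latest tagged version of the plugin.
--experimental.plugins.robots-txt.modulename=github.com/aon4o/traefik-plugin-robots-txt
--experimental.plugins.robots-txt.version=v1.0.0
```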

Usage

# Dynamic configuration

http:
  routers:
    my-router:
      rule: Host(`localhost`)
      service: service-foo
      entryPoints:
        - web
      middlewares:
        - my-robots-txt

  services:
    service-foo:
      loadBalancer:
        servers:
          - url: http://127.0.0.1

  middlewares:
    my-robots-txt:
      plugin:
        robots-txt:
          aiRobotsTxt: true
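The same middleware can also be attached through Docker labels. A minimal sketch, assuming a container named `whoami` (the container and router names are illustrative, not from the original docs):

```yaml
services:
  whoami:
    image: traefik/whoami
    labels:
      - "traefik.http.routers.whoami.rule=Host(`localhost`)"
      - "traefik.http.routers.whoami.middlewares=my-robots-txt"
      - "traefik.http.middlewares.my-robots-txt.plugin.robots-txt.aiRobotsTxt=true"
```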

Reference

| Name        | Description                                     | Default value | Example                                  |
|-------------|-------------------------------------------------|---------------|------------------------------------------|
| aiRobotsTxt | Enable retrieval of the ai.robots.txt list      | false         | true                                     |
| customRules | Add custom rules at the end of the file         |               | \nUser-agent: *\nDisallow: /private/\n   |
| overwrite   | Remove the original robots.txt file content     | false         | true                                     |
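The options can be combined. For example, a middleware that discards the upstream robots.txt and serves only the ai.robots.txt list plus a custom rule might look like this (illustrative values, not from the original docs):

```yaml
http:
  middlewares:
    my-robots-txt:
      plugin:
        robots-txt:
          aiRobotsTxt: true   # prepend the ai.robots.txt list
          overwrite: true     # drop the upstream robots.txt content
          customRules: |      # appended at the end of the file
            User-agent: *
            Disallow: /private/
```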

Development

This is a fork of the original plugin by Solution Libre. If you want to contribute, please do so in the original repository.
