Add default robots.txt to block crawlers (#959)
sissbruecker authored Jan 26, 2025
1 parent 085d67e · commit e6ebca1
Showing 2 changed files with 4 additions and 0 deletions.
bookmarks/static/robots.txt (2 additions, 0 deletions)

@@ -0,0 +1,2 @@
+User-agent: *
+Disallow: /
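
These two rules tell every crawler (User-agent: *) that the entire site (Disallow: /) is off limits, which fits a personal bookmark manager that should not be indexed. As a sanity check, a minimal sketch using Python's standard urllib.robotparser (not part of the commit) shows that the rules deny every path to every user agent:

# Verify that the two robots.txt rules above block all crawling.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /"])

print(rp.can_fetch("Googlebot", "/bookmarks/"))  # False: path is blocked
print(rp.can_fetch("*", "/"))                    # False: even the root is blocked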
uwsgi.ini (2 additions, 0 deletions)

@@ -4,6 +4,7 @@ env = DJANGO_SETTINGS_MODULE=siteroot.settings.prod
 static-map = /static=static
 static-map = /static=data/favicons
 static-map = /static=data/previews
+static-map = /robots.txt=static/robots.txt
 processes = 2
 threads = 2
 pidfile = /tmp/linkding.pid
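
The added static-map directive makes uWSGI serve /robots.txt directly from the bundled static file, without routing the request through Django. A minimal smoke test against a running instance might look like the sketch below; the address http://localhost:9090 is an assumption about a typical local deployment, not something taken from the commit:

# Fetch robots.txt from a hypothetical locally running linkding instance.
# http://localhost:9090 is an assumed address; adjust for your deployment.
from urllib.request import urlopen

body = urlopen("http://localhost:9090/robots.txt").read().decode()
print(body)
# Expected output:
# User-agent: *
# Disallow: /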
@@ -18,6 +19,7 @@ if-env = LD_CONTEXT_PATH
 static-map = /%(_)static=static
 static-map = /%(_)static=data/favicons
 static-map = /%(_)static=data/previews
+static-map = /%(_)robots.txt=static/robots.txt
 endif =

 if-env = LD_REQUEST_TIMEOUT
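
Inside the if-env block, uWSGI's %(_) placeholder expands to the value of the matched environment variable, here LD_CONTEXT_PATH, so the same robots.txt is also served when linkding runs under a subpath. A plain-Python illustration of that expansion, assuming the hypothetical value LD_CONTEXT_PATH=linkding/ (the existing static-map lines suggest the value ends with a trailing slash):

# Not uWSGI code: a plain-Python sketch of how /%(_)robots.txt expands.
# "linkding/" is a hypothetical LD_CONTEXT_PATH value.
context_path = "linkding/"
mapping = f"/{context_path}robots.txt=static/robots.txt"
print(mapping)  # /linkding/robots.txt=static/robots.txt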
