
No-robots.sh entrypoint fails if using non-standard Drupal root. #130

Open
grahamethompson opened this issue Feb 4, 2020 · 2 comments

@grahamethompson

If /app/web is not used as the Drupal root, for example /app/docroot, line 7 of no-robots.sh fails to write robots.txt in non-production environments.

if [ ! "${LAGOON_ENVIRONMENT_TYPE}" == "production" ]; then
    printf "User-agent: *\nDisallow: /\n" > /app/web/robots.txt
fi

[govcms7-paas]nginx:/app$ sh /lagoon/entrypoints/20-no-robots.sh 
/lagoon/entrypoints/20-no-robots.sh: line 7: can't create /app/web/robots.txt: nonexistent directory

Maybe doing something like this would help PaaS scallywags?

if [ ! "${LAGOON_ENVIRONMENT_TYPE}" == "production" ]; then
    printf "User-agent: *\nDisallow: /\n" > /"${APP_DIR:-app}"/"${WEBROOT:-web}"/robots.txt
fi
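
Something slightly more defensive along the same lines (just a sketch, assuming APP_DIR and WEBROOT stay optional with the same defaults as above) could also skip the write when the target directory does not exist:

# Sketch only: deny-all robots.txt in non-production environments,
# defaulting to /app/web when APP_DIR / WEBROOT are unset.
if [ "${LAGOON_ENVIRONMENT_TYPE}" != "production" ]; then
    target_dir="/${APP_DIR:-app}/${WEBROOT:-web}"
    if [ -d "${target_dir}" ]; then
        printf "User-agent: *\nDisallow: /\n" > "${target_dir}/robots.txt"
    fi
fi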
grahamethompson added a commit to grahamethompson/govcmslagoon that referenced this issue Feb 4, 2020
@fubarhouse
Contributor

Note that /app/web is utilized only in Drupal 8, whereas /app is used for Drupal 7.
Will have to double-check that, but to me the above is expected...

@stooit
Collaborator

stooit commented Feb 4, 2020

The govcmslagoon/.docker/images/nginx/no-robots.sh entrypoint is simply incorrect for D7; it should point to /app for SaaS and /app/docroot for PaaS.
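
A minimal sketch of that branching (assuming the presence of /app/docroot is a reliable way to tell the PaaS layout from the SaaS one):

if [ "${LAGOON_ENVIRONMENT_TYPE}" != "production" ]; then
    if [ -d /app/docroot ]; then
        # PaaS layout: Drupal root is /app/docroot
        printf "User-agent: *\nDisallow: /\n" > /app/docroot/robots.txt
    else
        # SaaS layout: Drupal root is /app
        printf "User-agent: *\nDisallow: /\n" > /app/robots.txt
    fi
fi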

There is another block in place at the nginx level (https://github.com/govCMS/govcmslagoon/blob/develop/.docker/images/nginx/helpers/200-no_robots_govcms_host.conf) - but this is not a blanket rule for all non-production environments.

Great pickup, thanks @thompsong
