1 Answer
Hello.
According to the documentation linked below, a page will not be indexed if the response includes the header "X-Robots-Tag: noindex".
In other words, one option is to add logic on the application side that returns "X-Robots-Tag: noindex" in the response header whenever the request's Host header is anything other than your own domain.
https://developers.google.com/search/docs/crawling-indexing/robots-meta-tag
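As a minimal sketch of the application-side idea: a small helper that decides, from the request's Host header, whether to attach "X-Robots-Tag: noindex" to the response. The function name `robots_headers` and the canonical host (taken from the Apache example below) are assumptions; you would call this from whatever framework you use when building the response.

```python
# Canonical domain taken from the Apache example; adjust to your site.
CANONICAL_HOST = "howtoabroad.com"

def robots_headers(request_host: str) -> dict:
    """Return extra response headers for a given Host header value.

    Adds "X-Robots-Tag: noindex" when the request was made under any
    hostname other than the canonical domain, so crawlers do not index
    duplicate copies of the site.
    """
    # Strip an optional port ("example.com:443") and normalize case
    # before comparing against the canonical domain.
    host = request_host.split(":")[0].lower()
    if host != CANONICAL_HOST:
        return {"X-Robots-Tag": "noindex"}
    return {}
```

For example, `robots_headers("howtoabroad.com")` returns `{}`, while a request made under some other hostname gets the `noindex` header.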
Alternatively, you could configure the web server (Apache, etc.) to reject requests whose Host header does not match your domain:
<VirtualHost *:443>
    ServerName howtoabroad.com
    DocumentRoot /var/www/html

    SSLEngine on
    SSLCertificateFile /path/to/your/certificate.crt
    SSLCertificateKeyFile /path/to/your/private.key
    SSLCertificateChainFile /path/to/your/chainfile.pem

    <Directory "/var/www/html">
        # Requires mod_rewrite. Return 403 Forbidden when the Host
        # header is anything other than howtoabroad.com ([NC] makes
        # the hostname comparison case-insensitive).
        RewriteEngine on
        RewriteCond %{HTTP_HOST} !^howtoabroad\.com$ [NC]
        RewriteRule ^ - [F]
    </Directory>
</VirtualHost>
These domains seem to point at an instance, not at a Lightsail container endpoint. Can you clarify why you mention "Lightsail container", please?