1 Answer
Hello.
Looking at the documentation below, it seems that if you return "X-Robots-Tag: noindex" in the response header, the page will not be indexed.
In other words, one option is to add logic on the application side that returns "X-Robots-Tag: noindex" in the response header whenever the request arrives via a hostname other than your own domain.
https://developers.google.com/search/docs/crawling-indexing/robots-meta-tag
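The application-side approach could be sketched as a small WSGI middleware. This is a minimal, hedged example with stdlib Python only; the function names are illustrative, and "howtoabroad.com" is taken from the vhost config in this thread.

```python
# Illustrative sketch: wrap any WSGI app so that responses served under
# a non-canonical Host header carry "X-Robots-Tag: noindex".
CANONICAL_HOST = "howtoabroad.com"  # assumption: your own domain

def noindex_for_other_hosts(app):
    """WSGI middleware adding X-Robots-Tag: noindex for foreign hosts."""
    def middleware(environ, start_response):
        # HTTP_HOST may include a port (e.g. "example.com:443"); strip it.
        host = environ.get("HTTP_HOST", "").split(":")[0]

        def patched_start_response(status, headers, exc_info=None):
            if host != CANONICAL_HOST:
                headers = list(headers) + [("X-Robots-Tag", "noindex")]
            return start_response(status, headers, exc_info)

        return app(environ, patched_start_response)
    return middleware
```

The same idea applies in any framework that lets you hook into the response headers (Flask's `after_request`, Express middleware, etc.): compare the request's Host against your canonical domain and add the header when they differ.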
Alternatively, you could configure the web server (Apache, etc.) to reject requests whose Host header does not match your domain, for example:
<VirtualHost *:443>
    ServerName howtoabroad.com
    DocumentRoot /var/www/html

    SSLEngine on
    SSLCertificateFile /path/to/your/certificate.crt
    SSLCertificateKeyFile /path/to/your/private.key
    SSLCertificateChainFile /path/to/your/chainfile.pem

    <Directory "/var/www/html">
        # Requires mod_rewrite. Return 403 Forbidden when the Host
        # header is anything other than howtoabroad.com.
        RewriteEngine on
        RewriteCond %{HTTP_HOST} !^howtoabroad\.com$
        RewriteRule ^.*$ - [F,L]
    </Directory>
</VirtualHost>
These domains seem to point at an instance, not a Lightsail container endpoint. Can you clarify why you mention "Lightsail container", please?