1 Answer
Hello.
Looking at the documentation below, it seems that if you return "X-Robots-Tag: noindex" in the response header, the page will not be indexed.
In other words, one option is to add logic on the application side that returns "X-Robots-Tag: noindex" in the response header whenever the request's Host header is anything other than your own domain.
https://developers.google.com/search/docs/crawling-indexing/robots-meta-tag
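The application-side approach could be sketched as a small WSGI middleware. This is only an illustration, not a specific framework's API: the host name `howtoabroad.com` and the function names are assumptions you would adapt to your own stack.

```python
# Sketch: WSGI middleware that adds "X-Robots-Tag: noindex" to every
# response whose Host header is not the canonical domain.
# CANONICAL_HOST and all names below are illustrative assumptions.
CANONICAL_HOST = "howtoabroad.com"

def noindex_on_foreign_host(app):
    def middleware(environ, start_response):
        # Strip any ":port" suffix and normalize case before comparing
        host = environ.get("HTTP_HOST", "").split(":")[0].lower()

        def patched_start_response(status, headers, exc_info=None):
            if host != CANONICAL_HOST:
                # Foreign host: tell crawlers not to index this response
                headers = list(headers) + [("X-Robots-Tag", "noindex")]
            return start_response(status, headers, exc_info)

        return app(environ, patched_start_response)

    return middleware
```

Requests arriving under the canonical host are passed through untouched; any other Host value gets the extra header, so mirrored or scraped copies served under a different domain advertise themselves as non-indexable.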
Alternatively, how about configuring the web server (Apache, etc.) to reject requests whose Host header does not match your domain?
<VirtualHost *:443>
    ServerName howtoabroad.com
    DocumentRoot /var/www/html

    SSLEngine on
    SSLCertificateFile /path/to/your/certificate.crt
    SSLCertificateKeyFile /path/to/your/private.key
    SSLCertificateChainFile /path/to/your/chainfile.pem

    <Directory "/var/www/html">
        # Requires mod_rewrite. Return 403 Forbidden whenever the
        # Host header is not the canonical domain.
        RewriteEngine on
        RewriteCond %{HTTP_HOST} !^howtoabroad\.com$ [NC]
        RewriteRule ^.*$ - [F,L]
    </Directory>
</VirtualHost>
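If you would rather keep serving the page but mark it non-indexable instead of blocking it outright, Apache 2.4's `<If>` expression together with mod_headers could set the header conditionally. A sketch, again assuming `howtoabroad.com` is your canonical host:

```
<If "%{HTTP_HOST} !~ /^howtoabroad\.com$/">
    # Requires mod_headers: mark responses under foreign hosts as noindex
    Header set X-Robots-Tag "noindex"
</If>
```

This combines both ideas from above: legitimate traffic is unaffected, while copies reached under another domain carry the noindex signal.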