Description
Virtual Robots.txt is an easy (i.e. automated) solution for creating and managing a robots.txt file for your site. Instead of mucking about with FTP, files, permissions, etc., just upload and activate the plugin and you’re done.
By default, the Virtual Robots.txt plugin allows access to the parts of WordPress that bots like Google need to reach. Other parts are blocked.
If the plugin detects an existing XML sitemap file, a reference to it will be automatically added to your robots.txt file.
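For illustration only (these are common WordPress-style directives, not necessarily the plugin’s exact defaults), the virtual robots.txt served at your site root might look something like this, with the Sitemap line appended when a sitemap is detected:

    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php

    Sitemap: https://example.com/sitemap.xml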
Installation
- Upload the pc-robotstxt folder to the /wp-content/plugins/ directory
- Activate the plugin through the ‘Plugins’ menu in WordPress
- Once you have the plugin installed and activated, you’ll see a new Robots.txt menu link under the Settings menu. Click that menu link to see the plugin settings page. From there you can edit the contents of your robots.txt file.
FAQ
Will it conflict with an existing robots.txt file?
If a physical robots.txt file exists on your site, WordPress won’t process any request for one, so there will be no conflict.
Will it work for sub-folder installations of WordPress?
Out of the box, no. Because WordPress is installed in a sub-folder, it won’t “know” when someone is requesting the robots.txt file, which must be at the root of the site.
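One possible workaround, assuming an Apache server and a hypothetical sub-folder named /blog/, is to rewrite requests for the root robots.txt to the WordPress installation from the site-root .htaccess:

    RewriteEngine On
    RewriteRule ^robots\.txt$ /blog/robots.txt [L]

Adjust /blog/ to match your actual sub-folder; other servers (e.g. nginx) need an equivalent rule.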
Does this plugin modify posts, pages, or categories?
No, it doesn’t.
Why does the plugin block certain files and folders by default?
By default, the virtual robots.txt is set up to block the WordPress files and folders that should not be accessible to search engines. If you don’t agree with the default settings, you can easily change them.
Contributors & Developers
“Virtual Robots.txt” is open source software. The following people have contributed to this plugin.
Contributors
“Virtual Robots.txt” has been translated into 1 locale. Thank you to the translators for their contributions.
Translate “Virtual Robots.txt” into your language.
Interested in development?
Browse the code, check out the SVN repository, or subscribe to the development log by RSS.
Changelog
1.10
- Fix to prevent the saving of HTML tags within the robots.txt form field. Thanks to TrustWave for identifying this issue.
1.9
- Fix for PHP 7. Thanks to SharmPRO.
1.8
- Undoing last fixes as they had unintended side-effects.
1.7
- Further fixes to issue with newlines being removed. Thanks to FAMC for reporting and for providing the code fix.
- After upgrading, visit and re-save your settings and confirm they look correct.
1.6
- Fixed bug where newlines were being removed. Thanks to FAMC for reporting.
1.5
- Fixed bug where plugin assumed robots.txt would be at http when it may reside at https. Thanks to jeffmcneill for reporting.
1.4
- Fixed bug for link to robots.txt that didn’t adjust for sub-folder installations of WordPress.
- Updated default robots.txt directives to match latest practices for WordPress.
- Plugin development and support transferred to Marios Alexandrou.
1.3
- Now uses do_robots hook and checks for is_robots() in plugin action.
1.2
- Added support for existing sitemap.xml.gz file.
1.1
- Added link to settings page, option to delete settings.
1.0
- Initial version.