With @nuxtjs/robots v4 and @nuxtjs/sitemap v6, setting the disallow option to '/' prevents the sitemap from being generated properly.
While I understand that a non-indexable website shouldn't need a sitemap, in practice we are often forced to do things we shouldn't do in the first place.
My robots.config:
[
  {
    userAgent: '*',
    disallow: '/',
  },
]
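For context, here is how that rule set might be wired up in a nuxt.config.ts with both modules registered. This is a minimal sketch based on the issue's description; the exact option names can differ between module versions.

```typescript
// nuxt.config.ts — minimal sketch, assuming both modules are registered.
// The `robots` rule array mirrors the config shown above; option names
// may differ between @nuxtjs/robots v3 and v4.
export default defineNuxtConfig({
  modules: ['@nuxtjs/robots', '@nuxtjs/sitemap'],
  robots: [
    {
      userAgent: '*',
      disallow: '/',
    },
  ],
})
```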
The generated sitemap contains only one URL:
http://mywebsite.com/
When downgrading to @nuxtjs/robots v3 and @nuxtjs/sitemap v5, I can set Disallow: / in the robots.txt, and my sitemap is still generated as expected.
This is actually an accidental side effect of how Nuxt Robots is built: it won't generate the specified robots.txt rules if the site isn't indexable, and we need to filter the sitemap URLs by the robots rules.
I could show all routes, but then a non-indexable site may list URLs that aren't indexable in production, which leads to a confusing state as well.
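To illustrate why a blanket disallow empties the sitemap, here is a rough sketch of the kind of prefix-based filtering described above. This is not the module's actual code; the rule shape and helper names are made up for illustration.

```typescript
// Illustrative sketch (not the module's actual implementation) of filtering
// sitemap URLs against robots disallow rules.
interface RobotsRule {
  userAgent: string
  disallow: string
}

// A path is disallowed when it starts with any non-empty disallow prefix.
function isDisallowed(path: string, rules: RobotsRule[]): boolean {
  return rules.some(r => r.disallow !== '' && path.startsWith(r.disallow))
}

const rules: RobotsRule[] = [{ userAgent: '*', disallow: '/' }]
const urls = ['/', '/about', '/blog/post-1']

// With disallow: '/', every path matches the prefix, so nothing survives
// the filter — which is why the generated sitemap ends up nearly empty.
const allowed = urls.filter(p => !isDisallowed(p, rules))
```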
I'm not too sure how to proceed. As a workaround, you could deploy your app as a production build and put an htpasswd in front of it if you need to emulate a production environment more closely.