I have a pretty complex sitemap setup for a site with tens of thousands of links. As such, it's divided up among multiple XML files; however, my employer also requested that the sitemaps be organized under indexes so that they reflect the site's interaction flow. For example, my site will have a page URL like:
https://www.example.com/page1
Here page1 has thousands of subpages. So I created an index specifically for links under page1, called page1-index.xml, which needs to be accessible at https://www.example.com/page1/page1-index.xml. Since there are thousands of links, they're also separated into multiple categories, with each category being its own sitemap linked from page1-index.xml.
So what I've got is an index at https://www.example.com/page1/page1-index.xml, and hundreds of sitemaps linked from that index as https://www.example.com/page1/page1-category-n-sitemap.xml.
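For concreteness, each index follows the standard sitemap index format and looks roughly like this (the URLs are just placeholders for my real category sitemaps):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/page1/page1-category-1-sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/page1/page1-category-2-sitemap.xml</loc>
  </sitemap>
  <!-- ...hundreds more category sitemaps... -->
</sitemapindex>
```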
All the sitemap files live in a subfolder of the public folder, in a tree that mirrors the sitemap structure, so page1-index.xml sits at public/sitemaps/page1/page1-index.xml and a single route in my App.tsx can cover all sitemaps.

Where I'm mainly confused is how exactly I should be returning these XML files from the router. I've got thousands of sitemaps, so it's not practical to submit each one individually to search engines. If I submit the indexes, then the crawlers need to be able to access the individual sitemaps, and that's the part of the setup I'm uncertain about. How should I set up a route (or routes) to return the XML files when a crawler accesses a link from the index?
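To make the question concrete, here's a rough sketch of the kind of catch-all route I was imagining (assuming react-router-dom v5; the component and path names are placeholders, and the component body is exactly the part I don't know how to fill in):

```tsx
// App.tsx – rough sketch only; I'm not sure serving raw XML from a
// React component like this is even the right approach.
import React from "react";
import { BrowserRouter, Route } from "react-router-dom";

// Supposed to read the matching file from public/sitemaps/<page>/<file>
// and return it as raw XML rather than rendering an HTML page – this is
// the part I'm unsure how to do.
const SitemapRoute: React.FC = () => {
  return null;
};

const App: React.FC = () => (
  <BrowserRouter>
    {/* One catch-all route meant to cover every sitemap under every page */}
    <Route path="/:page/:sitemapFile" component={SitemapRoute} />
  </BrowserRouter>
);

export default App;
```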