I have three domains, example-A.com, example-B.com and example-C.com. They are all different, but they all point to the same folder on my server (the three domains are closely connected but administrated through a single Laravel application; for more, see Multiple websites with one Laravel installation). Now I need a different sitemap.xml file for each domain.
I found in this answer how to execute PHP inside XML files. With that I could dynamically change the content of sitemap.xml depending on which domain is requested. Will this work for crawlers and search engines when they try to access the XML file, or is there a more recommended way of doing this?
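The kind of thing I have in mind is a Laravel route that builds the sitemap at request time based on the host. This is only a rough sketch with placeholder paths and hard-coded domain names, not my actual code:

```php
<?php
// routes/web.php — rough sketch, assuming a standard Laravel setup.

use Illuminate\Support\Facades\Route;

Route::get('/sitemap.xml', function () {
    // Pick the entries for whichever domain served this request.
    $host = request()->getHost();

    $urls = [
        'example-A.com' => ['/', '/about', '/contact'],
        'example-B.com' => ['/', '/products'],
        'example-C.com' => ['/', '/blog'],
    ][$host] ?? ['/'];

    // Build the sitemap XML by hand (placeholder URLs only).
    $xml = '<?xml version="1.0" encoding="UTF-8"?>';
    $xml .= '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">';
    foreach ($urls as $path) {
        $xml .= '<url><loc>https://' . $host . $path . '</loc></url>';
    }
    $xml .= '</urlset>';

    // Serve with an XML content type so crawlers treat it as a sitemap.
    return response($xml, 200)->header('Content-Type', 'application/xml');
});
```

Would serving the sitemap from a route like this be treated the same by crawlers as a static sitemap.xml file?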