  {RETIRED} XWiki Root Webapp / XTROOT-2

Default robots.txt paths in web build


Details

    • Type: New Feature
    • Resolution: Fixed
    • Priority: Minor
    • Fix Version/s: 1.1
    • Affects Version/s: 1.0
    • Component/s: None
    • Environment: all

    Description

      Because a web application is the best judge of which of its paths are useful and appropriate for web crawlers to index, the application's deployment should be able to set the paths under its context root in the server's robots.txt file.
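
      For example, assuming a hypothetical context root of /xwiki and hypothetical path choices, the deployed application might end up contributing a block like the following to the server's robots.txt:

        User-agent: *
        Disallow: /xwiki/bin/edit/
        Disallow: /xwiki/bin/delete/
        Disallow: /xwiki/resources/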

      There is, at present, no known standard for doing this, but we'll be looking for one, and creating one if there isn't one already.

      As an initial guess, this would probably be just a list of paths to disallow (relative to the context root), to be read by the deployment process and used, suitably formatted, to replace the existing Disallow lines in robots.txt that start with the application's deployed context root. A sed script could do this on Unix machines, as sketched below.
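
      As a rough sketch of that idea (the file locations, list format, and context root below are assumptions, and the in-place sed -i form is GNU sed), the deployment step could look something like this:

        #!/bin/sh
        # Context root under which the application is deployed (assumed value).
        CONTEXT_ROOT=/xwiki
        # Application-supplied list of paths to disallow, one per line,
        # relative to the context root (e.g. "bin/edit/").
        DISALLOW_LIST=WEB-INF/robots-disallow.txt
        # The server's robots.txt (assumed location).
        ROBOTS=/var/www/html/robots.txt

        # 1. Drop the existing Disallow lines that start with this context root.
        sed -i "\\|^Disallow: ${CONTEXT_ROOT}/|d" "${ROBOTS}"

        # 2. Append a freshly formatted Disallow line for each listed path.
        while read -r path; do
          printf 'Disallow: %s/%s\n' "${CONTEXT_ROOT}" "${path}"
        done < "${DISALLOW_LIST}" >> "${ROBOTS}"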


          People

            Assignee:
            Sergiu Dumitriu (sdumitriu)
            Reporter:
            Brian M. Thomas (macsentropy)
            Votes:
            0
            Watchers:
            0

            Dates

              Created:
              Updated:
              Resolved: