The reason it makes sense to allow derivative works is that things can change over time (not necessarily in this case, but we don't know in advance).
There are in fact some widely used extensions to the format described in the draft. Many search engines recognize "Sitemap:" giving the location of a site map, and "Crawl-delay:" telling crawlers to slow down. On the other hand, a lot of them don't recognize the #, $, and * characters in sec 2.2.3 as special. I don't think we need to fix these to publish the draft, but it wouldn't be absurd to update it in light of experience.
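For concreteness, here is a minimal sketch of a robots.txt combining the special characters from sec 2.2.3 with the two extensions; the paths and the delay value are invented for illustration:

    # "#" starts a comment; "*" and "$" are the sec 2.2.3 special characters
    User-agent: *
    Disallow: /private/      # ordinary prefix match
    Disallow: /*.pdf$        # "*" matches any characters, "$" anchors the end
    Crawl-delay: 10          # extension: pause between fetches

    # extension: pointer to a site map, independent of any group
    Sitemap: https://example.com/sitemap.xml

A crawler that treats # $ * as literal path characters will match none of those Disallow lines as intended, which is exactly the interoperability gap described above.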
I don't like to guess, but it could be that the authors had some experience (or hearsay from friends) with another technology standardized in the IETF where there was a lot of discussion, and incorrectly generalized from that.
Someone already suggested moving the file to a different location, which, considering that all robots.txt files have been in the same place for 25 years, seems like a bad idea.
Regards,
John Levine, johnl@xxxxxxxxx, Taughannock Networks, Trumansburg NY
Please consider the environment before reading this e-mail. https://jl.ly