People use DA (Domain Authority) to gauge which websites in their niche are trustworthy and to decide where to pursue good inbound links. The robots.txt file is then parsed and instructs the robot as to which pages should not be crawled. A search engine crawler may also keep a cached copy of this file.
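As a minimal sketch of how a crawler applies robots.txt rules, the standard library's `urllib.robotparser` can parse the file and answer whether a given URL is fetchable. The rule text and URLs below are hypothetical examples, not fetched from a live site.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; a real crawler would fetch this
# from https://example.com/robots.txt before crawling.
rules = """\
User-agent: *
Disallow: /private/
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)  # parse the rules the same way a crawler would

# Pages under /private/ are disallowed; everything else is allowed.
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("*", "https://example.com/public/page.html"))   # True
```

A well-behaved crawler checks `can_fetch` before requesting each page, and may cache the parsed rules rather than re-downloading robots.txt on every request.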