ویرگول
فاطمه جهانشاهلو
3 min read · 2 years ago

How to Make a robots.txt File

robots.txt

robots.txt is designed to limit automated search engine robots' access to a site's content. Search engines such as Google, Yahoo, and Microsoft use search bots, known as spiders, bots, or crawlers, to find and index web pages and sites.

All standard robots on the Internet respect these rules and robots.txt restrictions, and do not crawl or index the disallowed pages. Spam bots, however, pay no attention to this file, so password protection, not robots.txt, should be used to keep sensitive content safe from web bots.

The robots.txt file is built around three directives:

  • Disallow — defines the area forbidden to crawlers
  • Allow — defines the area free to crawl
  • User-agent — names the bot whose access is being blocked or unblocked
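Putting these directives together, a minimal robots.txt might look like the sketch below; the paths shown (such as /wp-admin/ and /photos/) are illustrative assumptions, not rules from any real site:

```txt
# Applies to every crawler
User-agent: *
# Block the admin area...
Disallow: /wp-admin/
# ...but allow one file inside it
Allow: /wp-admin/admin-ajax.php

# Block only Google's image crawler from the photos directory
User-agent: Googlebot-Image
Disallow: /photos/
```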

No special program is needed to create the robots file. You can use plain Windows Notepad or any other text editor that can save a TXT file.

To create a robots.txt file, just create a new .txt file. The file's encoding must be UTF-8. Now open this file and write the necessary directives in it according to the instructions above.
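If you prefer a script over Notepad, a short Python sketch can write a UTF-8 encoded robots.txt; the directives here are placeholder examples, not rules for your site:

```python
from pathlib import Path

# Example directives -- replace these with your site's real rules
rules = "\n".join([
    "User-agent: *",
    "Disallow: /private/",
    "Allow: /public/",
    "",  # end the file with a newline
])

# The file must be saved as plain text with UTF-8 encoding
Path("robots.txt").write_text(rules, encoding="utf-8")
```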

Uploading the robots.txt file to the site:

The robots.txt file must be located in the root, that is, directly in the website's main folder. It must not be placed inside any subfolder or subdirectory.

You can easily view any website's robots.txt file. Just append /robots.txt to the site's root URL and open it.
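Because the file always sits at the site root, you can derive its URL from any page address. A minimal sketch using Python's standard library (the example.com URL is just an illustration):

```python
from urllib.parse import urljoin

def robots_url(page_url: str) -> str:
    """Return the robots.txt URL for the site that hosts page_url."""
    # Joining against an absolute path ("/robots.txt") discards the
    # page's own path and keeps only the scheme and host.
    return urljoin(page_url, "/robots.txt")

print(robots_url("https://example.com/blog/post-1"))
# https://example.com/robots.txt
```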

Testing the Robots file with the Google Search Console tool

If you have connected your website to Google Search Console, opening this test tool will ask you to select the connected site you want to work with.

After selecting a website, you are taken to a page that displays the latest version of the robots.txt file that Google has fetched and reviewed. You can edit the file on this page; pressing the submit button then opens a new page.

On this page, you will see three buttons.

With the first button, you download the new robots.txt file.

Then place this file on the host server in place of the previous one.

After uploading, click the View uploaded version button to open the new version.

Finally, press the Submit button to ask Google to fetch and check the new file. If everything goes through, the date and time of the bot's last file check will change to a time after your request. To confirm, you can open the same tool again.

This tool cannot edit the robots.txt file directly. After you press the submit button, a window opens asking you to download the edited file and replace the previous file on the website's hosting server yourself.

If you want to test specific pages, enter the page's address in the bottom bar and then select the Google bot you want. Each time you click the test button, the tool shows whether you have allowed robots to access that page or not.

For example, you can check whether the Google Images bot has access to a certain page. You may have given the web bot access to the page while the image bot is not allowed to fetch the images and display them in search results.
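You can reproduce this kind of per-bot check offline with Python's standard urllib.robotparser module. The rules below are a hypothetical example of the situation described: the image crawler is blocked from a directory that the regular web crawler may still fetch.

```python
import urllib.robotparser

# Hypothetical rules: /photos/ is blocked for Googlebot-Image only
rules = """\
User-agent: *
Disallow: /private/

User-agent: Googlebot-Image
Disallow: /photos/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The regular web crawler falls under the "*" group, so the page is allowed...
print(rp.can_fetch("Googlebot", "https://example.com/photos/cat.jpg"))
# ...but the image crawler has its own group blocking the same path.
print(rp.can_fetch("Googlebot-Image", "https://example.com/photos/cat.jpg"))
```

The first call prints True and the second False, matching what the Search Console tester would show for the two bots.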

An energetic girl who loves learning and writing. I blossom a little more every day; watch me grow.