How to make a robots.txt file that increases site visibility
Step 1. Open Notepad
• Open a default text editor program on your computer, e.g. Notepad.
• Copy and paste the following into Notepad.
User-agent: AdsBot-Google-Mobile
User-agent: AdsBot-Google
User-agent: Googlebot
Allow: /
These rules allow Google's crawlers (AdsBot-Google-Mobile, AdsBot-Google, and Googlebot, which cover mobile ads, desktop ads, and Search) to crawl your entire site.
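As a quick sanity check, the rules above can be verified with Python's built-in urllib.robotparser module (a sketch; the page URL is just an example):

```python
from urllib.robotparser import RobotFileParser

# The rules from the robots.txt file above.
rules = [
    "User-agent: AdsBot-Google-Mobile",
    "User-agent: AdsBot-Google",
    "User-agent: Googlebot",
    "Allow: /",
]

parser = RobotFileParser()
parser.parse(rules)

# Googlebot is allowed to crawl any page on the site.
print(parser.can_fetch("Googlebot", "http://www.dable.io/any-page"))  # True
```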
Step 2. Save File and Upload
• After creating the file, save it under the name 'robots.txt'.
• Upload the file to the root directory (root folder) of your site. The root directory is the top-level 'parent' folder in the folder structure. For example, if a root folder named 'Dable' contains three subfolders (Company A, Company B, and Company C), then the root directory of the site is 'Dable'.
• The root directory is often named public_html, html, www, wwwroot, or htdocs. If you cannot find a folder with one of these names, look for the folder that contains an index file (index.php, index.html, or default.html) and upload the robots.txt file there (note that it could be an html folder under www). You're all set!
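If your hosting provider gives you FTP access, the upload step can be scripted with Python's standard ftplib module. This is a minimal sketch; the host name and credentials are placeholders, and your root directory name may differ:

```python
from ftplib import FTP
from io import BytesIO

# The robots.txt rules from Step 1.
ROBOTS_TXT = (
    "User-agent: AdsBot-Google-Mobile\n"
    "User-agent: AdsBot-Google\n"
    "User-agent: Googlebot\n"
    "Allow: /\n"
)

def upload_robots(host: str, user: str, password: str) -> None:
    """Upload robots.txt to the site's root directory over FTP."""
    with FTP(host) as ftp:
        ftp.login(user, password)
        # If your root directory is a subfolder, change into it first, e.g.:
        # ftp.cwd("public_html")
        ftp.storbinary("STOR robots.txt", BytesIO(ROBOTS_TXT.encode("utf-8")))

# Example call (placeholder credentials; replace with your own):
# upload_robots("ftp.example.com", "username", "password")
```

Many hosts use SFTP instead of plain FTP; in that case, upload the same file with your host's file manager or an SFTP client.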
Step 3. Check and Confirm
• For example, if your website address is http://www.dable.io, type http://www.dable.io/robots.txt into your browser's address bar. If you can see everything you entered in Notepad, your robots.txt file has been uploaded successfully.
• There should be only one robots.txt file for each site.
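The check above can also be automated by fetching the live file and comparing it with what you saved. A sketch using Python's standard urllib (the site URL is an example, and the function needs the file to already be online):

```python
from urllib.request import urlopen

# The rules saved in Step 1.
EXPECTED = (
    "User-agent: AdsBot-Google-Mobile\n"
    "User-agent: AdsBot-Google\n"
    "User-agent: Googlebot\n"
    "Allow: /\n"
)

def robots_matches(site: str, expected: str = EXPECTED) -> bool:
    """Fetch <site>/robots.txt and compare it with the expected rules."""
    with urlopen(site.rstrip("/") + "/robots.txt") as resp:
        served = resp.read().decode("utf-8")
    # Ignore leading/trailing whitespace differences when comparing.
    return served.strip() == expected.strip()

# Example call (requires the file to be live):
# robots_matches("http://www.dable.io")
```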
We hope these tips help you enjoy a successful advertising experience with Dable! We will keep striving to provide information that helps you increase your profits.