What Is Robots.txt? A robots.txt file is a text file stored in a website’s root directory that tells web crawlers which pages, folders and/or file types they may or may not access, crawl and index. These instructions can apply to all bots or give guidance to specific user-agents. Robots.txt files use the Robots Exclusion Protocol, developed in 1994 as a standard for websites to communicate with crawlers and other internet bots. When website owners want to tell bots how to crawl their sites, they load the robots.txt file in their...
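As a rough illustration (the user-agents, paths and sitemap URL below are hypothetical placeholders, not taken from the article), a simple robots.txt might look like this:

# Rules for all crawlers
User-agent: *
Disallow: /admin/

# Rules for one specific user-agent
User-agent: Googlebot
Disallow: /private/

# Optionally point crawlers at the site's XML sitemap
Sitemap: https://www.example.com/sitemap.xml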
What Is an XML Sitemap and How to Use One? An XML (Extensible Markup Language) sitemap is a text file that lists all the URLs on a website. It can include extra information (metadata) about each URL, such as when it was last updated, how important it is and whether alternative versions of the URL exist in other languages. All of this helps search engines crawl your website more efficiently, allowing any changes to be fed to them directly, including when a new...
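As a sketch (the URL, date and priority below are placeholder values, not from the article), a minimal XML sitemap with a single entry could look like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; <lastmod> and <priority> are the optional metadata mentioned above -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-03-25</lastmod>
    <priority>1.0</priority>
  </url>
</urlset>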
What Are Title Tags and How to Use Them? A title tag represents the title of an HTML page document. Title tags are written in the <head> section of an HTML document: <head><title>This is My Title</title></head> Title tags are required elements in every HTML document. The title must be text-only, and it is shown in the browser's title bar or in the page's tab. Title tags are used by search engines to determine what your page is about, so try to make the title as accurate and meaningful as possible! Title tags are also...
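For context, a minimal HTML page (a generic sketch, not from the article) showing where the title tag sits:

<!DOCTYPE html>
<html lang="en">
  <head>
    <!-- Text-only title, shown in the browser tab / title bar and used by search engines -->
    <title>This is My Title</title>
  </head>
  <body>
    <p>Page content goes here.</p>
  </body>
</html>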