A new website's first obstacle is getting seen by the search engines. Bing and Yahoo crawl sites quickly after submission, but Google drives more referral traffic than any other search engine, so getting a new website indexed by the search giant is a priority for any new webmaster. Google suggests indexing can take a month or more, but a few steps can speed up the process. Example files for several of these steps appear at the end of this article.

Verify the Website in Webmaster Tools

Google's Webmaster Tools application provides website owners with statistics and index status. Before it reports anything, the website must be verified by placing an empty HTML page in the root directory of the domain. The file name is an alphanumeric key generated in the Webmaster Tools application. After creating a Webmaster Tools account, prove you own the domain by generating the HTML file and uploading it to your website. Verification is instant once the page is in place.

Create and Submit a Sitemap

Sitemap.xml is the standard format for handing Googlebot a list of links to crawl (see the minimal example at the end of this article). The sitemap is submitted to Google through the Webmaster Tools application. Google may take several hours to crawl your sitemap for the first time, but subsequent crawls happen automatically. A common webmaster mistake is assuming a crawled sitemap means the website is indexed. Submitting one improves the odds of quicker indexing, but the site may still take several weeks to show in search results; most website owners report an average of about two weeks after the sitemap is submitted.

Backlinks

A backlink is created when your domain's link is posted on another website. Backlinks are used to calculate PageRank, a number that measures a website's popularity (the published formula appears at the end of this article). Google's guidelines require that backlinks be natural, meaning they come from users who recommend your website to others online. Hiring marketing companies that spam paid links produces devalued backlinks: paid links may raise PageRank temporarily, but it drops quickly once Google devalues them. Make sure your backlink marketing follows Google's guidelines.

Validate Website Code

If Googlebot cannot crawl the website because of coding errors, the domain may not be indexed. Typical errors for new webmasters include doubled HTML header tags, an incorrect DOCTYPE at the top of the page, and missing closing tags. These errors may not show up in a visitor's browser, but they still trip up Googlebot. Detect them by running the site through a validation service: at validator.w3.org, type the domain into the text box and fix any errors it reports (a skeleton that validates cleanly appears at the end of this article).

Provide Original and Unique Content

Pointing users to unique, informative content is the goal of Google's search engine. Websites made up entirely of links and duplicate content create a poor user experience, which runs against Google's mission. Make sure the site's text is not duplicated from elsewhere, and update pages with fresh, unique text several times a month; Google tends to crawl websites more frequently when content is updated regularly.

Continue to enhance the website and verify that its optimization follows Google's guidelines. After the website is indexed, the real work begins: earning organic rank. Providing users with a good experience and practicing compliant search engine optimization keep a site ranking well.
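The example files referenced above follow. First, a minimal sitemap sketch in the standard sitemaps.org 0.9 format that Webmaster Tools accepts; the domain, paths, and dates are placeholders for your own URLs:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- example.com, the paths, and the dates are placeholders -->
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2012-01-15</lastmod>
        <changefreq>weekly</changefreq>
      </url>
      <url>
        <loc>http://www.example.com/about.html</loc>
        <lastmod>2012-01-10</lastmod>
      </url>
    </urlset>

Save the file as sitemap.xml in the root directory, then submit its URL in Webmaster Tools. The sitemaps.org protocol also allows a single "Sitemap: http://www.example.com/sitemap.xml" line in robots.txt, which helps Bing and Yahoo discover the same file without a separate submission.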
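For readers curious how backlinks translate into PageRank, the formula published in Brin and Page's original paper is:

    PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

Here T1 through Tn are the pages linking to page A, C(Ti) is the number of outbound links on page Ti, and d is a damping factor the paper sets to 0.85. Each backlink contributes a share of the linking page's own PageRank, divided by how many links that page carries. Google's production ranking system is far more complex and is not public, so treat this as the published model only.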
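Finally, a skeleton of a page that passes validation at validator.w3.org. This sketch uses the HTML5 DOCTYPE; if your site declares HTML 4.01 or XHTML instead, the markup must match whichever DOCTYPE sits on the first line, since that is what the validator checks against:

    <!DOCTYPE html>
    <html lang="en">
    <head>
      <meta charset="utf-8">
      <title>Example Page</title>
    </head>
    <body>
      <p>Unique page content goes here.</p>
    </body>
    </html>

Note the single head section, the DOCTYPE on the very first line, and a closing tag for every opened element; these are exactly the spots where the double-header, wrong-DOCTYPE, and missing-tag errors described above creep in.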