If your site is finished, submit it to Google at http://www.google.com/submityourcontent/ and submit a sitemap through Google Webmaster Tools. Google uses the sitemap to learn about the structure of our site.
When designing the site, it is better to build a clear hierarchy with text links: every page should be reachable from at least one static text link. Page titles should describe the existing content clearly and accurately. We should also use the words that information seekers actually type, which makes it easier for users to find our site. Text works better than images, because the Google crawler cannot detect text inside images. If you still want to use images for textual content, use the "ALT" attribute to include a few words of descriptive text, and make sure that these elements and ALT attributes are accurate.
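As a quick way to audit the ALT advice above, a small script can scan a page's HTML for images that lack descriptive ALT text. This is only a sketch using Python's standard `html.parser`; the sample HTML and file names are made up for illustration.

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Collects <img> tags that lack a non-empty ALT attribute."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                # Record the image source so it can be fixed later.
                self.missing_alt.append(attrs.get("src", "(no src)"))

checker = AltChecker()
checker.feed('<img src="logo.png"><img src="team.jpg" alt="Our team photo">')
print(checker.missing_alt)  # images that still need ALT text
```

Running this against real pages before publishing helps ensure the crawler can understand every image.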
Check for broken links and correct HTML. Be careful if you want to use dynamic pages (URLs that contain the character '?'), since not every crawler handles them well, and there are important things to consider for images and video as well. Use a text browser such as Lynx to examine our site, and let search bots crawl the website without session IDs or arguments that track their path through the site.
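The broken-link check above can be sketched with Python's standard `html.parser`: collect every `href` on a page, then flag the ones that do not resolve. In a real crawl each link would be fetched over HTTP; here the set of existing pages is a hypothetical stand-in so the sketch runs offline.

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Extracts every href so the links can be checked for breakage."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

collector = LinkCollector()
collector.feed('<a href="/about.html">About</a> <a href="/missing.html">Old</a>')

# Hypothetical set of pages known to exist on the site; a real checker
# would issue HTTP requests and treat 404 responses as broken.
existing_pages = {"/about.html"}
broken = [link for link in collector.links if link not in existing_pages]
print(broken)  # links that point nowhere
```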
In addition, we must ensure that our web server supports the If-Modified-Since HTTP header, which allows the server to tell Google whether our content has changed since the site was last crawled. This feature saves bandwidth and overhead. Use a robots.txt file on our web server; this file tells crawlers which directories can or cannot be crawled.
To better understand how crawlers are brought to our website, visit http://code.google.com/web/controlcrawlindex/docs/faq.html, and to test a robots.txt file we can use Google Webmaster Tools.
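Besides Webmaster Tools, robots.txt rules can also be tested locally with Python's standard `urllib.robotparser`. The rules below are for a hypothetical site that blocks one private directory.

```python
from urllib.robotparser import RobotFileParser

# robots.txt for a hypothetical site: block crawlers from /private/.
rules = [
    "User-agent: *",
    "Disallow: /private/",
]
parser = RobotFileParser()
parser.parse(rules)

# Paths not matched by any Disallow rule are crawlable by default.
print(parser.can_fetch("Googlebot", "http://example.com/index.html"))
print(parser.can_fetch("Googlebot", "http://example.com/private/x"))
```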
Advertisements on a page do not affect its search rankings. And to prevent crawling of search result pages, we can use robots.txt.
Test that the site appears correctly in different browsers.
This is useful for monitoring and for giving users more relevant results than Google alone provides. A fast site also increases user satisfaction and improves the quality of the web overall. Google recommends that webmasters monitor site performance using Page Speed, YSlow, WebPagetest, or other tools.
Guidelines for maintaining quality. These guidelines describe fraudulent behavior and other negative practices that will draw a negative response from Google. If we feel that another site is abusing the quality guidelines (a negative practice), it can be reported to https://www.google.com/webmasters/tools/spamreport.
The basic principles for maintaining site quality:
When making pages, do not deceive the user by showing different content to users and to search engines, a practice known as "cloaking". Do not participate in link schemes designed to improve a site's ranking or PageRank. Also avoid links to web spammers or "bad neighborhoods" on the web, which will probably have a negative influence through that link. And keep in mind not to use unauthorized computer programs to submit pages, check rankings, and so on; Google does not recommend products such as WebPosition Gold that send automatic or programmatic queries to Google. Some things that should be used as guidelines in maintaining quality:
Avoid hidden text or links
Do not use cloaking or sneaky redirects
Do not send automated queries to Google
Do not load pages with irrelevant keywords
Do not create multiple pages, subdomains, or domains with substantially duplicate content
Do not create pages with malicious behavior, such as phishing or installing viruses, Trojans, or other badware
Avoid "doorway" pages created just for search engines, or other "cookie cutter" approaches such as affiliate programs with little or no original content
Reference: google.com