Very often the most popular sites are not the highest-quality ones, just the best promoted. There is no paradox here; it is a normal pattern. Promotion simply takes time and money, and if either of these two factors is neglected, the site's success is doomed. I start with this because in this article I want to talk not about the qualitative side of the question but, so to speak, the quantitative one.
The method discussed in this article suits a rather limited circle of people, since not everyone has the strength and patience to carry it through to its logical conclusion. Still, the method has a right to exist.
The idea is this: create one or several very large sites, place banner-exchange code on them, and spend the accumulated banner impressions on advertising the main site.
First, determine what we need:
1. The main, high-quality site (it can be quite small).
2. Several supporting sites with elements of search engine optimization.
3. A lot of effort and patience.
We need a few (the more, the better) large content sites on any topic (except erotica and porn). Each site should contain a great deal of textual information, and ideally it should also look presentable. These will be our donor sites.
Someone may ask: "But the donor sites also need promotion, otherwise no banner impressions will accumulate on them." That is true, but only partly. We do not need to promote them at full strength, because most of the time and effort will go into creating the sites themselves. All hope rests on the search engines: once the sites are created, we register them with the search engines, and sooner or later they will be indexed. To help them get indexed fully and successfully, make a site map for each one, i.e. a page with links to all (or most) of the other pages.
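As a rough illustration of such a site map page, here is a minimal Python sketch; the folder name and output file are assumptions made purely for the example. It scans a directory of HTML pages and writes one page that links to all of them:

    import os

    # Assumed layout: all pages of a donor site sit as .html files in this folder.
    SITE_DIR = "donor_site"
    OUTPUT = os.path.join(SITE_DIR, "sitemap.html")

    # Collect every HTML page except the site map itself.
    pages = sorted(
        name for name in os.listdir(SITE_DIR)
        if name.endswith(".html") and name != "sitemap.html"
    )

    # Write a plain page with one link per collected page.
    with open(OUTPUT, "w", encoding="utf-8") as out:
        out.write("<html><head><title>Site map</title></head><body>\n<h1>Site map</h1>\n<ul>\n")
        for page in pages:
            out.write(f'<li><a href="{page}">{page}</a></li>\n')
        out.write("</ul>\n</body></html>\n")

    print(f"Linked {len(pages)} pages in {OUTPUT}")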
We can expect traffic for the following reasons:
1. Even without any optimization, 1,000 indexed pages typically bring 30-40 visitors a day from search engines.
2. Our pages will contain elements of optimization, so we can count on more than that.
It is easy to calculate that a site with 10,000 pages can expect more than 300 unique visitors a day from search engines. With 10 such sites, total traffic exceeds 3,000 unique visitors. Suppose each visitor views two pages and each page carries three banners: that gives 18,000 banner impressions a day, i.e. in one month we accumulate more than 540,000 banner impressions, before the banner network's commission. These impressions are then spent on advertising the main site.
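The arithmetic behind these figures can be written out explicitly. A small sketch using the article's own assumptions (30 visitors per 1,000 indexed pages at the conservative end, 2 page views per visitor, 3 banners per page, 10 sites of 10,000 pages each):

    # Rough traffic and impression estimate from the assumptions above.
    pages_per_site = 10_000
    visitors_per_1000_pages = 30       # conservative end of the 30-40 range
    sites = 10
    pageviews_per_visitor = 2
    banners_per_page = 3
    days_per_month = 30

    daily_visitors_per_site = pages_per_site / 1000 * visitors_per_1000_pages            # 300
    daily_visitors_total = daily_visitors_per_site * sites                               # 3,000
    daily_impressions = daily_visitors_total * pageviews_per_visitor * banners_per_page  # 18,000
    monthly_impressions = daily_impressions * days_per_month                             # 540,000

    print(f"{daily_impressions:,.0f} impressions per day, "
          f"{monthly_impressions:,.0f} per month (before the network's commission)")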
The resulting numbers look good, but at what cost! Imagine what it takes to produce 100,000 pages and get them all indexed by the search engines. It is very difficult. The problem can be eased by using an existing database of texts plus some software to generate the pages. I will say right away: I have neither, so please do not ask.
Related Documents
http://arjudba.blogspot.com/2009/10/seo-link-building-directory-article.html
http://arjudba.blogspot.com/2009/11/affordable-search-engine-optimization.html
http://arjudba.blogspot.com/2010/05/tips-to-build-search-engine-optimized.html
http://arjudba.blogspot.com/2010/05/search-engine-promotion-terminology.html
http://arjudba.blogspot.com/2010/05/tips-to-create-successful-website-for.html
http://arjudba.blogspot.com/2010/05/search-methodology-finding.html
http://arjudba.blogspot.com/2010/05/pay-per-click-thinking-and-thoughts.html
http://arjudba.blogspot.com/2010/05/introduction-to-search-directories.html
Monday, May 3, 2010
Introduction to search directories
Catalogs, or directories, are places where users can find the sites they need, just as they do with search engines. But there are significant differences between directories and search engines, and anyone who is going to promote a site needs to understand them.
While search engines accept almost any site without quality requirements, catalogs, as a rule, impose requirements on a site's quality and content. In the largest and best-known directories submissions are reviewed by people, so low-quality sites do not make it into the database. The conclusion: before registering our site in directories, we must make sure it is ready. Directories usually list only the front page of a site (another contrast with search engines).
Where to register:
If we have done all of the above, our site is ready. But a finished site, as such, is not yet ready for registration in the catalogs.
First, how and where to register. We should set ourselves the goal of getting into at least three directories (incidentally, for successful site optimization this is both necessary and sufficient) in order to gain a nice extra influx of visitors (it may even become the main one, depending on how competently the optimization is done). But, as we already understand, we should not optimize the site for all directories at once before submitting: it should be tailored to each of them to achieve good positions. The peculiarities of each directory are covered in detail in the sections devoted to them, but be sure to read this overview, because the directories share common features that we need to know about.
We may spend more time looking for the right section, but the effect will be much more noticeable. The more general sections hold more sites, so it is harder to attract a user there, especially one who is interested in the subject but does not yet know exactly what he needs; a user who knows exactly what he is looking for will certainly check the more precise, narrow sections. One could argue that the more general topics attract more visitors and therefore more potential traffic. That would be a mistake: the narrower sections also get plenty of visitors (these directories are not just popular, they are over-popular), while they contain far fewer sites and a much more focused audience.
How to register.
Almost all directories ask for the same things at registration: the site's name and its description. The site name, or site title, is a very important characteristic, so we have to work hard on it. Write a short but well-targeted and attractive description of roughly 15-20 words. Look at the descriptions of our competitors in the same section and draw conclusions; they can even serve as a model. Try to make sure our main keyword appears in the description, but do not stuff the description with such words: that never leads to anything good. The editor may also write the description of our site himself: if he feels our description does not match reality, he will write his own, and in the editor's version our keyword may not appear even once. Or (as is usually done) he will simply reject our registration application. If we think the description we entered at registration is poor, we can write to the editor and ask him to change it; perhaps the request will be granted. Keep track of the information we enter about the site and check everything for errors, especially the site URL: a single mistake (or typo) in the URL will negate all our efforts.
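As a quick sanity check before submitting, something like the following sketch can verify the 15-20 word guideline and that the keyword appears without repetition; the description text and keyword below are placeholders, not recommendations:

    # Placeholder values purely for illustration.
    description = ("Practical advice on promoting a small business website: "
                   "search engine submission, directory listings and banner exchange basics.")
    keyword = "website"

    words = description.split()
    keyword_hits = sum(1 for word in words if keyword.lower() in word.lower())

    print(f"Words: {len(words)} (aim for roughly 15-20)")
    print(f"Keyword occurrences: {keyword_hits} (one is enough; more looks like stuffing)")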
That is probably all, in general terms, that can be said about directories.
Related Documents
http://arjudba.blogspot.com/2009/10/seo-link-building-directory-article.html
http://arjudba.blogspot.com/2009/11/affordable-search-engine-optimization.html
http://arjudba.blogspot.com/2010/05/tips-to-build-search-engine-optimized.html
http://arjudba.blogspot.com/2010/05/search-engine-promotion-terminology.html
http://arjudba.blogspot.com/2010/05/tips-to-create-successful-website-for.html
http://arjudba.blogspot.com/2010/05/search-methodology-finding.html
http://arjudba.blogspot.com/2010/05/pay-per-click-thinking-and-thoughts.html
Pay per click thinking and thoughts
We live in the shallow, warm waters of the Runet, writing letters, looking for information and occasionally creating our own content projects, often unaware of the invisible deep currents that in fact shape the face of the modern Internet. We notice them only when they surface: for example, when, as a result of the standoff between spammers and anti-spam systems, the percentage of letters that reach their recipients falls catastrophically and people start using faxes again (!).
Or when it turns out that, thanks to the work of optimizers, for a number of queries nothing can be found on the first pages of search results except commercial offers, and ranking is determined more and more by the advertising (in this case, optimization) budget.
A doorway is a page, usually generated automatically, that is optimized for a specific query or group of queries and is used solely to lure a visitor away from a search engine and then pass them on to the target site. Most doorways are placed on free hosting or on third- or fourth-level domains. A person who lands on such a doorway page is, as a rule, automatically redirected to the landing page, in this case a catalog page belonging to a traffic reseller.
Advertisers place ads through the system and pay when visitors click on those ads and move on to the advertiser's site. If the system's own advertising space is not enough (and it is never enough, not even for Google), it starts attracting partners: sites that display the same advertisements, ideally matched to the theme of the site and the content of its pages. The PPC system then shares the revenue received from advertisers with the partner. So far there is nothing wrong with this. Partners receive money, the PPC system takes its commission, advertisers get inexpensive, targeted traffic (visitors). Everybody is happy. But there is another link in the chain: traffic resellers. The thing is, an ordinary PPC system does not accept just anyone as a partner; it has to be a real site with non-zero traffic. So traffic resellers (typically catalogs consisting wholly or mainly of PPC listings), acting as partners of the PPC systems, run their own affiliate programs and share revenue with those infinitely small sites that "pump up" their traffic.
And here the real disaster begins, because these "second-level" partners are the very doorway pages that have already fouled search engine results quite badly and will keep fouling them even more.
Thanks to the simplicity and complete automation of the technology, doorway pages can be generated by the hundreds and thousands. And that is exactly what happens; the only limit is how quickly one can invent queries.
Why? It is very simple. If every doorway brings in even a dollar a month, multiply that by a thousand. Or by 10,000.
It has to be said that automatically generated doorway pages are often more relevant, from the search engine's point of view, than "normal" sites that have not been specially optimized.
What will happen?
The search results for high-frequency and/or commercially attractive queries get "occupied" by the sellers' optimized sites. And the sellers do not limit themselves to optimizing one site; they spawn dozens of similar ones, trying not just to get a good place in the results but to "hammer" the entire first page of results (better yet, the first few pages) with nothing but their own offers.
Overall, though, this changes nothing, because doorway-generation technology does not stand still either. And with the existing relevance criteria (page content plus index parameters), solving this problem seems impossible in principle.
Where is the way out? Apparently, only in admitting that the "artificial intelligence" of a search engine cannot withstand a collision with the natural intelligence of webmasters and SEO experts, and in turning to human-assessed relevance.
Judging by recent statements, this is exactly the direction Yahoo! has started to move in, planning to radically improve its search by drawing on its 90-million-user community. Let's see what happens.
Related Documents
http://arjudba.blogspot.com/2009/10/seo-link-building-directory-article.html
http://arjudba.blogspot.com/2009/11/affordable-search-engine-optimization.html
http://arjudba.blogspot.com/2010/05/tips-to-build-search-engine-optimized.html
http://arjudba.blogspot.com/2010/05/search-engine-promotion-terminology.html
http://arjudba.blogspot.com/2010/05/tips-to-create-successful-website-for.html
http://arjudba.blogspot.com/2010/05/search-methodology-finding.html
Search Methodology Findings
After observing thirty users of the worldwide network performing targeted searches on web sites, an interesting phenomenon was found:
The more attempts people make to find the material they are interested in, the less likely a successful outcome becomes.
The data speak for themselves: those who made a single attempt found the information they needed in 55% of cases, while users who searched twice succeeded only 38% of the time.
Fewer than 25% of users made more than two search attempts, and those who persisted and searched more than twice did not get positive results at all.
The main reason people search more than once is that the response to their query is "not found" or "no matches". Most users abandon further attempts as soon as they get such responses; still, many try their luck a second time.
On the first attempt, 23% of users got a "not found" result.
Of those who continued searching, 44% got a negative result the second time.
Of those who stubbornly carried on, 50% failed to find the required information on the third attempt.
Perseverance did not help even the most highly motivated users: for 100% of them the fourth attempt failed.
In theory, the more people use a search engine, the better their skills with it become; after all, every successful attempt is a moment of learning that helps them explore the features of the tool.
In practice, however, it is not so. The helpful tips intended to encourage users to keep searching do not actually help. Many sites place such tips on their "not found" pages so that users will try other ways of searching. Unfortunately, the existence of such "useful" recommendations does not increase the chances that the next attempt will succeed.
It is important to note that the methodology of the study is quite telling: people were deliberately directed to sites containing content that interested them. Even so, one in five of those searching got a "not found" message on the first attempt, which points to fundamental errors in site design.
The conclusion: the key to success for site developers is to let Internet users get the result they want on the very first search attempt. For sites that reach this level, success is all but guaranteed.
Related Documents
http://arjudba.blogspot.com/2009/10/seo-link-building-directory-article.html
http://arjudba.blogspot.com/2009/11/affordable-search-engine-optimization.html
http://arjudba.blogspot.com/2010/05/tips-to-build-search-engine-optimized.html
http://arjudba.blogspot.com/2010/05/search-engine-promotion-terminology.html
http://arjudba.blogspot.com/2010/05/tips-to-create-successful-website-for.html
Tips to create a successful website for Google
Quite often the opinion is voiced that articles spend too much time on the theory of building successful websites. Fine, let's skip the theory and turn to time-tested methods. What follows is a system for achieving, with 100% probability, the desired positions in Google for a wide range of queries. These are the techniques we use ourselves. Results usually depend on the theme, the target audience and the level of competition in the niche. Following these steps will build a successful site in Google within about a year, although it can be done in less time if we really decide to try.
A) Start by building content. Before we even choose a domain name for the site, we should have 100 pages ready. That is just to start. And these must be pages with real content, not link lists, splash pages and the like.
B) Domain name: easy to remember and meaningful. Do not stuff it with keywords; we need to create a brand, a name that will be easy to remember. The time of keyword domains has passed. Learn the lesson of GoTo.com, which recently became Overture.com: in my opinion one of the best examples of branding on the Internet, even though it meant throwing away whole years spent building the previous brand.
C) Site design: the simpler the better. There should be more text than HTML markup. Pages should display correctly in any browser. Stay away from heavyweight page elements such as Flash, Java and JavaScript; they usually add little to a site but can hurt it for a variety of reasons, the search engines' dislike of them being only one.
Build the site in a logical manner. Use the search terms we want to rank for as directory names. Or we can skip that and simply put everything in the root directory; despite advice to the contrary, this works well on many search engines, including Google. Refrain from unnecessary clutter such as "Best viewed with" notices, counters, buttons and so on. Keep it simple and professional-looking. Learn the lesson of Google itself: simplicity is what surfers want.
Download speed is not everything: our site should respond almost instantly. If we open the site in a browser and nothing happens for 3-4 seconds, we have a problem. The figure may vary with the server's location, but a site hosted in our own country should respond within 3-4 seconds. Any longer and we lose audience, roughly 10% for every extra second, and 10% can be the difference between success and failure. A quick way to measure this is sketched below.
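One way to get a feel for that response time is to time how long the first bytes of the page take to arrive. A minimal sketch using Python's standard library; the URL is a placeholder:

    import time
    import urllib.request

    URL = "http://example.com/"    # placeholder; substitute our own site

    start = time.time()
    with urllib.request.urlopen(URL, timeout=10) as response:
        response.read(1024)        # read just the first kilobyte
    elapsed = time.time() - start

    print(f"First response from {URL} took {elapsed:.2f} s")
    if elapsed > 3:
        print("Longer than 3 seconds: expect to start losing visitors.")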
Pages:
D) Page size: the smaller the better. Try not to exceed 15 KB. The smaller the better. Try not to exceed 12 KB. The smaller the better. Try not to exceed 10 KB. Get the idea? Stay within the range of 5 to 10 KB. Yes, it is hard to do, but it is possible and it works, both for the search engines and for visitors. A simple size check is sketched below.
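Checking a page against that 5-10 KB target is straightforward; in this sketch the file name is an assumption:

    import os

    page = "index.html"            # assumed file name for the example
    size_kb = os.path.getsize(page) / 1024

    print(f"{page}: {size_kb:.1f} KB")
    if size_kb > 10:
        print("Over the 10 KB guideline; consider trimming markup or splitting the page.")
    elif size_kb < 5:
        print("Under 5 KB; there may be room for more content.")
    else:
        print("Within the suggested 5-10 KB range.")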
E) Content: add one page of content, 200-250 words, every day. If we do not know what should go on a page, use the Overture keyword service. The resulting keyword list is the core of the page, its starting line.
F) Density, position, etc.: simple, old-fashioned advice. Use the keyword once each in the title, the description, the H1 tag, link text, bold, italics and near the beginning of the page. Try to keep the keyword frequency in the range of 5 to 20% (a rough density check is sketched below). Use natural phrases and check their spelling; search engines increasingly apply automatic correction to queries, and there is no reason to ignore that.
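A rough way to measure that keyword frequency on a finished page; the sample text and keyword are placeholders for illustration only:

    import re

    # Placeholder text and keyword.
    text = ("Website promotion takes patience. Good website promotion combines "
            "content, links and directories, and promotion results grow slowly.")
    keyword = "promotion"

    words = re.findall(r"[a-zA-Z']+", text.lower())
    hits = words.count(keyword.lower())
    density = 100.0 * hits / len(words) if words else 0.0

    print(f"'{keyword}' appears {hits} times in {len(words)} words: {density:.1f}% density")
    print("Suggested range from the article: 5-20%")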
G) External links: on every page, place links to one or two sites that already rank well for the queries we are targeting, and use those queries as the link text; this will be very useful in the future.
Related Documents
http://arjudba.blogspot.com/2009/10/seo-link-building-directory-article.html
http://arjudba.blogspot.com/2009/11/affordable-search-engine-optimization.html
http://arjudba.blogspot.com/2010/05/tips-to-build-search-engine-optimized.html
http://arjudba.blogspot.com/2010/05/search-engine-promotion-terminology.html
Tips to build a Search Engine Optimized Website
When a visitor arrives at our site from a search engine, is there anything there that intrigues them enough to keep navigating the site? And how do search engines evaluate our site in the first place? They do not take just index.html; they comb through all the links. So every page should be informative enough on its own, otherwise the user may leave without finding what he needed, even if we have it. The idea is that visitors can enter through any of the existing pages of our site.
In other words, each page should be a mini-site. Achieving this is not difficult; just look at our index.html. Create a navigation menu and paste it into every page. Every page should carry a link to the main page, links to the other key pages of the site, and an e-mail address. If the site consists of several hundred pages, limit the menu to a couple of dozen key ones.
Think about the pages that open in new windows and are intended mainly for displaying large graphics or small text fragments. On these pages, at least put a link to the main page. Some believe that putting a "noindex" tag in a page's code keeps it out of the index; in fact, some search engines simply ignore this tag and index every page from start to finish.
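For reference, the tag in question is the robots meta tag, and it is easy to check whether a page carries it. A minimal sketch with Python's standard HTML parser; the file name is an assumption for the example:

    from html.parser import HTMLParser

    class RobotsMetaFinder(HTMLParser):
        """Collects the content of any <meta name="robots" ...> tag."""
        def __init__(self):
            super().__init__()
            self.robots = []

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
                self.robots.append(attrs.get("content") or "")

    finder = RobotsMetaFinder()
    with open("popup.html", encoding="utf-8") as f:    # assumed file name
        finder.feed(f.read())

    if any("noindex" in content.lower() for content in finder.robots):
        print("Page asks not to be indexed, though not every engine honours this.")
    else:
        print("No noindex directive found; the page is eligible for indexing.")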
Reassess every page, word by word, and give each one its own set of keywords and phrases. Identifying them is not hard: just see which words are repeated most often in the page text. If it is still difficult to draw up the keywords and phrases, go the opposite way and build the page around the keywords.
Keywords should be spread through the text of the whole page, from top to bottom: three or four near the top, though by no means in a row (otherwise the search engines may blacklist the page), and a couple more in the middle and at the bottom. Place them among the rest of the text so that they turn up in the sentences naturally.
An example title:
"Web site promotion. How to promote a site using banner exchange networks. Tips for website promotion from scratch."
Remember that the search engine robot works by algorithms, and a page of 25 words should not repeat the keyword 7 times.
Here are the places on a page where we can safely insert the keyword (a rough checker sketch follows the list):
1. The URL of the page.
2. The "Title" tag.
3. Image file names.
4. The alt attribute of an image: it not only helps visitors understand what the picture shows, it is also indexed by search engines.
5. The JavaScript used when opening a new window.
6. The H1 tag: a keyword placed in a page's H1 is valued more highly, as is a keyword in the first line of the page.
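A rough sketch of how such a check might look, again with the standard-library parser; the keyword and file name are placeholders, and it only covers a few of the spots listed above (title, H1, image alt text and file names):

    from html.parser import HTMLParser

    KEYWORD = "promotion"          # placeholder keyword
    PAGE = "index.html"            # placeholder file name

    class KeywordSpots(HTMLParser):
        """Records whether the keyword shows up in the title, H1 text and img alt/src."""
        def __init__(self, keyword):
            super().__init__()
            self.kw = keyword.lower()
            self.current = None
            self.found = {"title": False, "h1": False,
                          "img alt": False, "img file name": False}

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag in ("title", "h1"):
                self.current = tag
            elif tag == "img":
                if self.kw in (attrs.get("alt") or "").lower():
                    self.found["img alt"] = True
                if self.kw in (attrs.get("src") or "").lower():
                    self.found["img file name"] = True

        def handle_endtag(self, tag):
            if tag == self.current:
                self.current = None

        def handle_data(self, data):
            if self.current in ("title", "h1") and self.kw in data.lower():
                self.found[self.current] = True

    checker = KeywordSpots(KEYWORD)
    with open(PAGE, encoding="utf-8") as f:
        checker.feed(f.read())

    for spot, present in checker.found.items():
        print(f"{spot}: {'yes' if present else 'no'}")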
As we can see, it is not so hard to create quality pages for search engines. And if we do not like our site's visit statistics, we can start reworking the site right now: type a keyword into any search engine and analyse the HTML code of the first 10 results. Look for all the details I have mentioned here and we will be able to spot the patterns.
Related Documents
http://arjudba.blogspot.com/2009/10/seo-link-building-directory-article.html
http://arjudba.blogspot.com/2009/11/affordable-search-engine-optimization.html
Search Engine Promotion Terminology
Boosts ranking
A method of indexing in which a site gets not only into the search engine's index but also into a thematic catalog.
Bridge-paging
A method of search engine promotion that uses doorway and hallway pages.
Crawling
The term for the way a search engine picks a site up into its index, for example deep crawling, instant indexing, etc.
Description
A brief description of a specific page or of the entire site.
Doorway page
A page created in order to come out on top in a search engine's index. It is built with the rules of the search engines in mind and may be located separately from the site itself.
Frequency
Used only in combination with other terms, e.g. keyword frequency: the frequency with which keywords are repeated on a page.
Hallway page
A page containing links to all the doorway pages. Using hallway pages can significantly reduce the time it takes to get a site indexed.
Index
The search engine's database, which stores all the information gathered by the search engine robot.
Internet Catalog
An Internet directory: links to sites, divided into categories. Typically a dedicated person oversees each section, so the sites accepted into the catalog tend to have good content.
Keyword
The words that most closely reflect the theme of a page.
Link Club
A club of sites: a community of sites on a particular subject whose members all place cross-links to one another.
Meta
HTML tags used for indexing by almost all search engines.
QQW
A query index: an indicator for a specific keyword, calculated by the formula: number of queries * (number of queries / number of pages found for the query).
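Written out with made-up numbers (both figures below are invented purely to show the shape of the formula), the calculation looks like this:

    # Hypothetical figures for one keyword, only to illustrate the formula.
    queries = 12_000          # number of queries containing the keyword
    pages_found = 300_000     # number of pages found for that query

    qqw = queries * (queries / pages_found)
    print(f"QQW = {qqw:.1f}")  # 12,000 * (12,000 / 300,000) = 480.0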
Query
A unique user query to the search engine. It can be a single word or a short phrase.
Ranking
Determination of a site's position in the index. For example, if two sites have absolutely identical text but different HTML markup, they will appear in different positions for one and the same query.
Relative
A notional value showing how effectively a site has been indexed.
REP
Robot Exclusion Protocol: the robots exclusion protocol (robots.txt), used for passive interaction with search engines.
Robot
A search engine program that follows links and indexes information from web pages. The results of the robot's work can be seen in any search engine.
Search Engine
The set of hardware and software that finds links to documents in response to users' queries.
Site mirror
A mirror copy of a site. A search engine may refuse to index mirror copies of a site.
Spamdexing
The use of prohibited techniques on web pages (hidden text, meta refresh, etc.) to deceive the search robot.
Stemming
The search engine's ability to match different forms of a keyword. For example, for the query "site", pages containing other forms of the word, such as "sites", will also be found.
Stop words
Words that some search engines leave out of their indexes and do not allow to be searched for. Most often these are very common words such as "and", "a", "the", "www". This is done to save time when processing a query.
Submit
Submitting a page to a search engine for indexing.
Web Ring
A ring of sites: a community of sites on the same topic. Each member places a small ring-navigation block on its site, from which the user can move to the next, the previous or the main site of the ring.