Understanding SEO

Search engine optimization (SEO) techniques improve a website's position (ranking) on search engines and, consequently, increase its volume of organic traffic: results that appear higher in the SERP are more likely to be seen (displayed) and clicked.

These techniques are divided into on-page (optimizations made directly on the website) and off-page (activities carried out outside the website), and they are based on a long list of ranking signals known as "SEO factors."

Among SEO activities, the following optimizations stand out:

  • the structure of the site and its URLs (URL optimization)
  • the accessibility of information to search engine spiders and crawlers (robots and sitemap optimization)
  • the source code (code and error optimization)
  • links (link optimization)
  • images (image optimization)
  • content (SEO copywriting)
  • backlinks (backlink optimization)

Optimization for search engines includes various technical operations carried out both on the HTML code and content of the website's pages and on the overall hypertext network surrounding the web domain as recorded in search engine archives. The quality and quantity of hyperlinks from other web domains pointing to a given domain are fundamental to that domain's ranking. Off-site activity is therefore indispensable; it is usually carried out through link building (guest posting is a common technique), the distribution of press releases, and quality article marketing campaigns.

Search engine optimization (SEO) is distinguished from search engine marketing (SEM) in that its goal is positioning in the organic SERPs, not in the space allocated to pay-per-click results (paid advertising). While SEO activities yield results in the medium to long term, the effects of pay-per-click are immediate.

The professional who specializes in optimization is the SEO (search engine optimizer), whose acronym coincides with that of the activity itself. On large or particularly complex sites, the SEO works in close contact with the project team because, depending on the techniques adopted, the work requires specific skills that are unlikely to be found in a single person.

Optimization for Google

In the early twenty-first century, Google introduced a new element in its approach to indexing the web: PageRank, an algorithm that evaluates the relevance of a page on a mathematical basis. Over the following decade this value lost importance; today it is considered one of many parameters to be taken into account, and not the most important. Robert Metcalfe (inventor of Ethernet) described the underlying principle as the "network effect": a network is more useful the higher the number of its users. Google indexes a page based on its content and on the links pointing to it. Not only that: it also uses about 200 signals (most of them undisclosed) that interact with its algorithm to assess the quality and various other characteristics of sites, blogs, and web services. To rank well on Google, a site must be SEO friendly, that is, structured so that search engine bots can read it easily; this requires particular care with many of the elements found on every page of a website and in any document uploaded to the web. Google's initial PageRank concept was over time complemented by TrustRank, a more complex index, although there is currently no official way to measure it for a given website.
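The PageRank idea described above can be sketched with a few lines of code: a page's score depends on the scores of the pages linking to it, computed by repeated redistribution (power iteration). The tiny link graph below is a hypothetical example, not real data.

```python
DAMPING = 0.85  # standard damping factor from the original PageRank paper

def pagerank(links, iterations=50):
    """Compute PageRank by power iteration.

    links maps each page to the list of pages it links to.
    """
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}  # start with a uniform score
    for _ in range(iterations):
        new_rank = {page: (1.0 - DAMPING) / n for page in pages}
        for page, outgoing in links.items():
            if not outgoing:
                # Dangling page: spread its rank evenly over all pages.
                for p in pages:
                    new_rank[p] += DAMPING * rank[page] / n
            else:
                # Each page splits its rank equally among its outgoing links.
                for target in outgoing:
                    new_rank[target] += DAMPING * rank[page] / len(outgoing)
        rank = new_rank
    return rank

# Hypothetical three-page site: B receives links from both A and C.
graph = {"A": ["B"], "B": ["C"], "C": ["A", "B"]}
ranks = pagerank(graph)
# B, being the most linked-to page, ends up with the highest score.
```

The example illustrates the "network effect" in miniature: page B outranks the others purely because more pages link to it, not because of anything in its content.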

Optimization Factors

Factors at the page level (on-page factors)

As for the pages themselves, part of the optimization work consists of making sure that:

  • The page code is formally valid. It can be useful to follow the WCAG specifications to make the content more "accessible", including for crawlers;
  • HTML tags such as the title tag and meta description are as semantically relevant as possible to the page content (ideally using LSI keywords, from Latent Semantic Indexing). Preferably place the main keyword at the beginning of the title tag, unless it is a branded keyword;
  • Image tags are filled in: image search is the second most common type of search on Google. To ensure that images are indexed and searchable by search engines, contributing to the findability of the page in which they appear, an image should have a URL that describes its content (e.g. /images/mobile_rosso.jpg) and a descriptive ALT attribute (the text displayed in place of the image). Further information can be provided in the TITLE attribute and in the caption (a label that describes the image and is used by the main CMSs). The main image-focused web apps can populate these fields from the EXIF data;
  • Page load speed is optimized: search engines give this signal much weight, so it is one of a good SEO's priorities. Improving it involves a number of changes to the code (reducing .css and .js lines in particular), implementing caching systems, setting up a Content Delivery Network (CDN), and compressing images;
  • Avoid, if possible, passing parameters to a server-side application by inserting a query string in the page URL, i.e. parameter=value pairs after a question mark following the page address. Some search engines will index and download only the base page, without taking the query into account;
    • Furthermore, if the query contains session-specific information (e.g. session IDs, which change for each visitor), Google sees a unique URL for each visit to the page and may draw various negative conclusions. For this type of information it is appropriate to use cookies instead;
  • Avoid non-HTTP redirects (e.g. via the meta refresh tag), because search engines are not guaranteed to follow them, and it is widely believed that their presence may penalize a page's link analysis ranking. Implement each redirection instead with an HTTP redirect response (3xx status codes); moreover, every redirect performed via tag is contrary to paragraphs 7.4 and 7.5 of WCAG 1.0;
  • Avoid, if possible, serving different content to crawlers (cloaking) by trying to recognize the user-agent string or the IP address of the bot scanning the pages. This is a potentially harmful practice that exposes the site, in extreme cases, to the very real risk of removal from the engines' indexes, and in most other cases to a lack of ranking benefit for its internal pages. If it is unavoidable for technical reasons (for example, to provide static content for a site built entirely in Flash), it is preferable to handle the alternative content client side, through session cookies.
  • Implement the robots.txt and sitemap.xml files to tell search engine crawlers which content to index and which to exclude from the indexing process.
  • Duplicate content is removed: Matt Cutts (former head of Google's webspam team) has stated that 25–30% of all content on the web is duplicated. Since duplication is a detracting factor, it is the SEO's job to remove duplicates or to fix the errors in the code that produce them (e.g. errors in pagination or in dynamic variables).
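Several of the on-page elements discussed above (title tag, meta description, descriptive image URL with ALT and TITLE attributes) can be shown together in a single page skeleton. All names and values below are hypothetical examples, not recommendations for a specific site:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <!-- Main keyword near the beginning of the title tag -->
  <title>Pet Care Guide – Example Site</title>
  <!-- Meta description: a semantically relevant summary, often shown in the SERP -->
  <meta name="description" content="Practical advice on feeding, grooming and training domestic animals.">
</head>
<body>
  <!-- Descriptive file name plus ALT and TITLE attributes for image search -->
  <img src="/images/mobile_rosso.jpg" alt="Red wooden cabinet" title="Red cabinet">
</body>
</html>
```

Crawlers read exactly this markup, so keeping these fields accurate and relevant is the practical core of on-page optimization.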

Factors at the domain level

With respect to the domain, the most important factors are:

  • Seniority (age) of the domain;
  • Keyword in the top-level domain.

Social factors

  • Number of likes and followers on social networks (external to search engines): it is unclear whether this figure is a ranking signal. Some endorse Matt Cutts's position, but the debate is open;
  • Shares and tweets: many SEO experts agree that these influence search engine algorithms.

Factors at the user-interaction level

  • Organic click-through rate (CTR);
  • Bounce rate;
  • Dwell time: a value that combines session duration, bounce rate, and CTR.
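The first two interaction metrics are simple ratios, which the following sketch makes explicit; the figures are made up purely for illustration:

```python
def organic_ctr(clicks, impressions):
    """Organic click-through rate: the share of SERP impressions that were clicked."""
    return clicks / impressions

def bounce_rate(single_page_sessions, total_sessions):
    """The share of sessions that viewed only one page before leaving."""
    return single_page_sessions / total_sessions

# Hypothetical figures for a single query, in the style of a search console report.
ctr = organic_ctr(clicks=120, impressions=4000)                    # 0.03 -> 3%
bounce = bounce_rate(single_page_sessions=45, total_sessions=100)  # 0.45 -> 45%
```

Dwell time, by contrast, has no single agreed formula; it is generally described as a composite of session duration, bounce rate, and CTR, as noted above.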

Factors at the backlink level

Backlinks are among the most important elements in SEO:

  • Number of dofollow backlinks: a highly determining factor; the links should preferably come from sites with different IP addresses and contextual relevance (semantically consistent content);
  • Anchor text of backlinks: used to determine the relevance of the link with respect to the topic. It is no coincidence that the anchor text often coincides with the search query being optimized, especially in the most aggressive link building campaigns (which are also the riskiest in terms of penalties).
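Both notions, dofollow links and anchor text, correspond to plain HTML anchors; the URLs and anchor text below are hypothetical:

```html
<!-- Dofollow backlink (the default): passes ranking signals; the anchor
     text "dog grooming tips" tells the engine what the target page is about -->
<a href="https://example.com/dog-grooming">dog grooming tips</a>

<!-- Nofollow backlink: asks engines not to pass ranking credit -->
<a href="https://example.com/dog-grooming" rel="nofollow">dog grooming tips</a>
```

Since no rel="nofollow" is present on an ordinary link, "obtaining dofollow backlinks" simply means obtaining normal links from pages whose publishers have not marked them nofollow.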

Optimization for PDF files

Optimizing PDF documents involves a few interventions at the time the file is created.

The most important elements to fill in properly so that a PDF document ranks in the SERPs are:

  • the file name: if it consists of multiple keywords, use hyphens to separate them. For example, a PDF file about pets could be named animali-domestici.pdf;
  • the document's internal properties: click "File" – "Properties" and be sure to fill in "Title", "Subject" (which acts as the description), "Author" (you can use your site name), and "Keywords";
  • the title of the document: if the document lacks one, Google will assign one itself. It is better, then, to set the title you want to give the document, for instance using the Verdana font, in italics and centered.
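Internally, the properties listed above end up in the PDF's document information dictionary, which is what search engines read. A sketch of the corresponding entries follows; the values are hypothetical examples:

```
/Title    (Domestic animals: a practical guide)
/Subject  (Choosing and caring for common household pets)
/Author   (example.com)
/Keywords (pets, dogs, cats, domestic animals)
```

Any PDF editor's "Properties" dialog, or a metadata library, writes exactly these fields, so filling in the dialog and optimizing the metadata are the same operation.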

As for including a clickable link within the pages, note that Google reads a PDF file as a kind of web page in its own right, so a link inside it counts as a backlink to your site. The important thing is to get the link target right, i.e. to point to the exact relevant page. For example, if a PDF about pets is hosted on an animal portal, it is better for its link to point to the page that deals precisely with domestic animals.

Source: Wikipedia