
Even if Google is not using LSI, they have filed five patents on phrase-based research,
which are covered in this WMW thread.
Temporal Analysis
Search engines can track how long things (sites, pages, links) have been in existence
and how quickly they change. They can track a huge amount of data such as
• How long a domain has been in existence
• How often page copy changes
• How much page copy changes
• How large a site is
• How quickly the site size changes
• How quickly link popularity builds
• How long any particular link exists
• How similar the link text is
• How a site changes in rank over time
• How related linking sites are
• How natural linkage data looks
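The signals above could be stored per domain as a simple record. This is a minimal sketch of such a record; the class name, fields, and method are all hypothetical, chosen only to mirror the list above:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class TemporalProfile:
    """Hypothetical per-domain record of the temporal signals listed above."""
    domain: str
    registered: date                  # how long the domain has existed
    page_change_rate: float           # fraction of page copy changed per crawl
    page_count: int                   # current site size
    page_count_history: list = field(default_factory=list)  # site size over time
    links_by_week: list = field(default_factory=list)       # link popularity growth
    rank_history: list = field(default_factory=list)        # rank changes over time

    def domain_age_days(self, today: date) -> int:
        # Age of the domain in days, one of the simplest temporal signals.
        return (today - self.registered).days
```

Each history list makes rate-of-change signals ("how quickly the site size changes", "how quickly link popularity builds") easy to compute by differencing consecutive entries.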
In some cases, it makes sense for real websites to acquire a burst of linkage data.
When news stories about a topic are circulating and search volumes for a particular
term are high, it is natural for some sites to acquire a large amount of linkage
data. If that link growth is natural, many of those links will come from high-quality,
active, frequently updated sites. Unless you are doing viral marketing, naturally
built links tend, most of the time, to build more slowly and evenly.
If links build in huge spikes from low-quality sites, then search engines may
discount those links, or even apply a penalty to the domain receiving that linkage data.
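One way to picture that filter is as a simple outlier test: flag a domain when its latest link count is far outside its own history and the new links come from low-quality sites. This is a toy sketch, not how any search engine actually works; the function name, the z-score threshold, and the 0-to-1 quality scores are all assumptions:

```python
from statistics import mean, pstdev

def flag_link_spike(weekly_links, quality_scores, z_threshold=3.0, min_quality=0.5):
    """Flag the latest week as suspicious when its link count is a statistical
    outlier versus history AND the new links are, on average, low quality.

    weekly_links   -- link counts per week, oldest first (latest week last)
    quality_scores -- quality scores (0..1) for links acquired in the latest week
    """
    history, latest = weekly_links[:-1], weekly_links[-1]
    mu = mean(history)
    sigma = pstdev(history) or 1.0   # avoid dividing by zero on flat history
    is_spike = (latest - mu) / sigma > z_threshold
    low_quality = mean(quality_scores) < min_quality
    return is_spike and low_quality
```

Note that a spike alone is not enough to trip the flag: a burst of links from high-quality, frequently updated sites passes, matching the point above that natural bursts do happen.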
Stale pages may also be lowered in relevance if other pages on the same topic are
referenced or updated more regularly. A page may be considered fresh if it
changes somewhat frequently or if it continues to acquire linkage data as time
passes. Certain types of queries (news-related ones, for instance) may boost the
scoring of fresh documents.
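The two freshness ideas above (recent changes, ongoing link acquisition) and the query-dependent boost can be sketched as follows. The decay half-life, the link cap, and the blend weight are illustrative assumptions, not known ranking parameters:

```python
import math

def freshness_score(days_since_update, links_last_90_days, half_life_days=90):
    """Toy freshness score: decays exponentially with time since the last
    update, but recent link acquisition can keep a page 'fresh'."""
    decay = math.exp(-math.log(2) * days_since_update / half_life_days)
    link_signal = min(links_last_90_days / 10.0, 1.0)  # cap the link contribution
    return max(decay, link_signal)

def query_score(base_relevance, freshness, query_is_news=False, boost=0.3):
    """For news-like queries, blend freshness into the final score;
    otherwise relevance alone decides."""
    if query_is_news:
        return (1 - boost) * base_relevance + boost * freshness
    return base_relevance
```

Taking the maximum of the two signals captures the "or" in the text: a page stays fresh either by changing or by continuing to earn links.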
