I came across a patent for SEO software. The inventors are Ray Grieselhuber, Brian Bartell, Dema Zlotin, and Russ Man. It’s called “Centralized web-based software solution for search engine optimization” and it was published on 12 June 2008.
They have patented a piece of software for SEO:
“In one aspect, the invention provides a system and method for modifying one or more features of a website in order to optimize the website in accordance with an organic listing of the website at one or more search engines. The inventive systems and methods include using scored representations to represent different portions of data associated with a website. Such data may include, for example, data related to the construction of the website and/or data related to the traffic of one or more visitors to the website. The scored representations may be combined with each other (e.g., by way of mathematical operations, such as addition, subtraction, multiplication, division, weighting and averaging) to achieve a result that indicates a feature of the website that may be modified to optimize a ranking of the website with respect to the organic listing of the website at one or more search engines.”
“… The solution 290 may make recommendations regarding improvements with respect to the site’s construction. For example, the solution 290 may make recommendations based on the size of one or more webpages (“pages”) belonging to a site. Alternative recommendations may pertain to whether keywords are embedded in a page’s title, meta content and/or headers. The solution 290 may also make recommendations based on traffic referrals from search engines or traffic-related data from directories and media outlets with respect to the organic ranking of a site. Media outlets may include data feeds, results from an API call and imports of files received as reports offline (i.e., not over the Internet) that pertain to Internet traffic patterns and the like. One of skill in the art will appreciate alternative recommendations.”
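The on-page checks mentioned in that passage (keywords embedded in a page’s title, meta content and headers) are easy to sketch. Here’s a minimal, hypothetical example using only Python’s standard library; the patent doesn’t disclose its actual implementation, and all names here are my own.

```python
from html.parser import HTMLParser


class KeywordAudit(HTMLParser):
    """Collects the title, meta description and header text from a page."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self.headers = []
        self._current = None  # tag whose text we are currently collecting

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("title", "h1", "h2", "h3"):
            self._current = tag
        elif tag == "meta" and attrs.get("name") == "description":
            self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        self._current = None

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current in ("h1", "h2", "h3"):
            self.headers.append(data)


def keyword_placement(html, keyword):
    """Report whether the keyword appears in the title, meta description and headers."""
    parser = KeywordAudit()
    parser.feed(html)
    kw = keyword.lower()
    return {
        "in_title": kw in parser.title.lower(),
        "in_meta": kw in parser.meta_description.lower(),
        "in_headers": any(kw in h.lower() for h in parser.headers),
    }
```

Feeding it a page and a target keyword returns a simple checklist, e.g. `keyword_placement(page_html, "blue widgets")` might report the keyword present in the title and meta description but missing from the headers.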
One of the claims is:
“…acquiring data associated with the website; generating a plurality of scored representations based upon the data; and combining the plurality of scored representations to achieve a result; recommending, based on the result, a modification to a parameter of the website in order to improve an organic ranking of the website with respect to one or more search engines.”
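The claim describes a pipeline: acquire data, score it, combine the scores, and recommend changes. A toy sketch of that idea might look like the following, assuming a simple weighted average as the combining operation (one of the operations the patent lists). The signal names, scores and weights are all invented for illustration; nothing here reflects the patent’s actual internals.

```python
# Hypothetical "scored representations" for one page, each on a 0.0-1.0
# scale (0.0 = poor, 1.0 = good). These values are made up.
signals = {
    "page_size": 0.9,          # page is reasonably small
    "keyword_in_title": 0.0,   # keyword missing from the <title>
    "keyword_in_headers": 1.0, # keyword present in headers
    "search_referrals": 0.4,   # below-average organic referral traffic
}

# Adjustable weights -- the knob you would presumably tune per site.
weights = {
    "page_size": 0.1,
    "keyword_in_title": 0.4,
    "keyword_in_headers": 0.2,
    "search_referrals": 0.3,
}


def combined_score(signals, weights):
    """Combine the scored representations as a weighted average."""
    total = sum(weights.values())
    return sum(signals[k] * weights[k] for k in signals) / total


def recommendations(signals, threshold=0.5):
    """Recommend fixing any signal that scores below the threshold."""
    return [name for name, score in signals.items() if score < threshold]
```

With the values above, `combined_score(signals, weights)` gives 0.41 and `recommendations(signals)` flags `keyword_in_title` and `search_referrals` as the parameters to modify, which matches my reading of the claim: the combined result drives which modification gets recommended.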
How many of us use statistical methods for SEO? I know I collect a lot of data, but not in the same format as this. Can this be reliable? Every site is very different and has different needs. A human is able to discuss this with the client and adapt the strategy accordingly. Can this system take those considerations into account as well? It is a recommendation system, so I would think you could adjust the weightings depending on the site you’re analysing. I would be interested to try this out in a free beta, but I don’t see myself handing over a handful of cash just yet.
I’m all for applying data mining techniques to SEO; I’ve looked into this before, and it is useful.