Why Domain Authority Is Used Over PageRank in Online Reputation Management

Prominence has an inherent bias. There is abundant evidence that the way options are presented can significantly influence people’s choices, and more prominent options tend to be favored disproportionately. For example, Ho and Imai (2006) and Meredith and Salant (2007) observe that being listed first on the ballot paper can significantly increase a candidate’s vote share. Lohse (1997) found that adverts that are larger, colorful, include graphics, or appear near the beginning of a heading are more likely to catch a reader’s attention. Likewise, prominent results in search engines, those at the top of the search engine result page, get most of the clicks. In commercial terms, prominence also leads to increased profits for options at the top of search engine results. It stands to reason that prominent negative results will have the opposite effect. This is the idea behind reputation management.

The first step in suppression for practitioners is to assess how much time it will take to deliver a standard milestone. Usually the first, and most of the time the last, milestone is to have the negative listing moved (or suppressed) from the first page of the search engine result page to the 2nd page. This has become a standard offering in online reputation management, and the reason is simple: a multitude of research indicates that the 2nd page gets at least 15 times fewer clicks and far less visibility than the first page, while the drop after the 2nd page is nowhere near as dramatic. One of the latest studies, conducted in 2014, found that on average 71.33% of searches result in a page-one organic click, while page two gets 3.99% and page three only 1.66%. The details and a review of past studies can be found here: https://moz.com/blog/google-organic-click-through-rates-in-2014
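As a quick illustration of why page 2 is treated as the milestone, the click-through figures cited above can be turned into relative visibility ratios. A minimal sketch, using only the percentages from the 2014 study:

# Organic click share by result page, from the 2014 Moz CTR study cited above.
click_share = {1: 71.33, 2: 3.99, 3: 1.66}  # percent of searches

page1_vs_page2 = click_share[1] / click_share[2]   # ~17.9x fewer clicks on page 2
page2_vs_page3 = click_share[2] / click_share[3]   # ~2.4x, a far smaller drop

print(f"Page 1 gets ~{page1_vs_page2:.1f}x the clicks of page 2")
print(f"Page 2 gets ~{page2_vs_page3:.1f}x the clicks of page 3")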

Currently, reputation managers use their experience to estimate what a project should cost during the pre-onboarding phase of the client relationship. Typically, the potential client and the reputation manager discuss which queries and close variations return search results in which the negative results are prominently displayed. That conversation leads to an estimate that is currently driven by the reputation manager’s confidence in suppressing the results, as well as by the “pain” or “damage” felt by the client, who continues to suffer the consequences of the reputation problem even while it is being suppressed.

Reputation managers mostly look at the rank of the negative result, based on a 10-position system, as well as the authority of the domain and page of the negative result, to ballpark the number of months the reputation manager or their firm needs to be retained to reach the first milestone: the 2nd page of Google.

We use a system represented by various inputs and an output detailing the project timeline to the basic milestone of suppressing the negative results to page 2 of Google search results and beyond.

INPUTS

A) Position of the highest-ranking negative result.

A Google search result page has, by default, 10 results, and the vast majority of Google users do not change the default. These 10 results are used as a benchmark to measure prominence and to track both project difficulty and progress toward the milestone, which is technically considered achieved when the negative result reaches the 11th position.

While some pages might show more than 10 results, this usually occurs when items are listed side by side horizontally across what may be considered a row. For the purpose of this input variable, each row counts as one position.
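For illustration, the two conventions above (10 results per page, milestone at the 11th position) can be captured in a small sketch; the function names are illustrative, not part of any existing tool:

RESULTS_PER_PAGE = 10  # Google's default, used as the benchmark above

def serp_page(position: int) -> int:
    """Return the result page (1-indexed) that a given position falls on."""
    return (position - 1) // RESULTS_PER_PAGE + 1

def milestone_reached(position: int) -> bool:
    """The standard milestone: the negative result sits at position 11 or lower."""
    return position >= RESULTS_PER_PAGE + 1

print(serp_page(5), milestone_reached(5))    # 1, False
print(serp_page(11), milestone_reached(11))  # 2, True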

B) Time elapsed, in months, since the negative result was published on the source website.

C) Authority of the website on which the original negative content was published. Ready-made inputs are obtainable from various web-based tools. For example:

Domain Authority: Domain Authority is a score (on a 100-point scale) that predicts how well a website will rank on search engines. To determine Domain Authority, Moz employs machine learning against Google’s algorithm to best model how search engine results are generated. Over 40 signals are included in this calculation. The Domain Authority score will often fluctuate, and it is used by the SEO community as a competitive metric against other sites. The score is provided by Moz, a SaaS company based in Seattle, Washington, and an API powers many 3rd-party tools used by the SEO community, such as Ahrefs, which publish the score. Domain Authority is therefore easily obtainable from many sources on the web. It is likely that such a number is already being loosely taken into consideration when “eyeballing” an estimate of SEO and suppression work, but currently this component is not part of a formula for developing an objective timeline, in any tool that uses Domain Authority.
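For illustration, such a ready-made score could be pulled programmatically. The sketch below assumes a generic HTTP endpoint and API key; the URL, parameters, and response field are placeholders, not the actual API of Moz or any particular tool:

import requests  # third-party HTTP library

# Hypothetical endpoint and credentials; substitute the real provider's API details.
API_URL = "https://api.example-seo-tool.com/url-metrics"
API_KEY = "YOUR_API_KEY"

def fetch_domain_authority(domain: str) -> float:
    """Fetch a Domain Authority-style score (0-100) for a domain from a metrics API."""
    response = requests.get(API_URL, params={"target": domain, "key": API_KEY}, timeout=10)
    response.raise_for_status()
    return float(response.json()["domain_authority"])  # field name is a placeholder

# Example: DA of the site hosting the negative result, used as Input C below.
# da = fetch_domain_authority("example-news-site.com")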

Page Authority:

Page Authority is a score (on a 100-point scale), developed by Moz, that predicts how well a specific page will rank on search engines. It is based on data from the Mozscape web index and includes link counts, MozRank, MozTrust, and dozens of other factors. It uses a machine learning model to find an algorithm that best correlates with rankings across thousands of search results. While Page Authority is more specific, its values tend to be lower than Domain Authority scores. If used in our formula, it would likely need a multiplier component.

PageRank:

Both Domain Authority and Page Authority use machine learning to mimic a subset of Google’s ranking variables for webpages, and, like our own formula, they are partially based on another proprietary metric called PageRank. PageRank is an algorithm used, in part, by Google Search to rank websites in its search engine results. It was named after Larry Page, one of the founders of Google, and is a way of measuring the importance of website pages. According to Google: PageRank works by counting the number and quality of links to a page to determine a rough estimate of how important the website is. The underlying assumption is that more important websites are likely to receive more links from other websites.
Source: https://www.google.com/insidesearch/howsearchworks/

Google recalculates PageRank scores each time it crawls the Web. The PageRank value of a page reflects the probability that a random surfer, clicking links at random, will land on that page. Thus, not all links counted by PageRank have the same value: a link from a page that is itself important passes more value than a link from an obscure page.
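To make the random-surfer idea concrete, here is a minimal power-iteration sketch of the published PageRank calculation on a toy link graph. This illustrates the original algorithm, not Google’s production system:

# Toy link graph: each page maps to the pages it links to.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],  # D links out but receives no links, so it ends up with the lowest score
}

def pagerank(links, damping=0.85, iterations=50):
    """Iteratively compute PageRank: the probability a random surfer lands on each page."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start from a uniform distribution
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share  # each outbound link passes an equal share
        rank = new_rank
    return rank

print(pagerank(links))  # "C" scores highest: it receives the most (and best) links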


One main disadvantage of PageRank is that it inherently favors older pages. A new page, even a very good one, will not have many links unless it is part of an existing site. While PageRank uses links to score the authority of a page, Domain Authority also appears to include other factors, such as traffic, which may decouple from backlinks. Furthermore, PageRank was once visible to verified site maintainers through the Google Webmaster Tools interface, but Google no longer exposes it there, because various strategies were used to manipulate PageRank.

Finally, PageRank is agnostic of specific industries or subject areas and assesses a website in the context of all websites on the Internet. The results on the SERP, however, set PageRank in the context of a specific keyword: in a less competitive subject area, even websites with a low PageRank can achieve high visibility, because the highest-ranked sites that match the specific search words occupy the first positions in the SERPs. This is another reason we use Domain Authority.

Trust Flow: Trust Flow is a trademarked metric of Majestic, an SEO tools company. Like Domain and Page Authority, it is a quality score on a 0-100 scale. Majestic collated many trusted seed sites based on a manual review of the web, and this process forms the foundation of Trust Flow. Sites closely linked to a trusted seed site can see higher scores, whereas sites with questionable links will see much lower scores. It is mentioned here as another potential proxy that can be used in our system.

SYSTEM:

The system is expressed in terms of formulas whose objective is to use these 3 inputs to generate an estimate of the time it would take to complete a project. This allows service providers to estimate the project cost by multiplying the output by paid units such as man-hours, wages, and salaries. It would also allow insurance companies to estimate potential liability under a policy, thereby offering products that share search engine reputational risk, and it can be used to establish policy rules. For example, the time-lapse input (Input B) is the only input without an upper bound, and it can therefore be the most impactful. Insurance products may come with policies that stipulate a maximum time lapse before the prominence of a negative result must be reported, should the policyholder wish to have suppression work done at the insurer’s expense.

Formula 1: Output expressed in terms of weeks:

B X (B + B/10) + (13 – A) X (1 + DA/100) = OUTPUT (Time to 2nd page in weeks)

where A is the position of the negative result (Input A), B is the number of months since it was published (Input B), and DA is the Domain Authority of the source website (Input C).

This formula can now be used to plug in the monthly and weekly fees charged by the provider under any future market conditions.
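A minimal Python sketch of Formula 1, with a helper that converts the week estimate into a budget at 4 weeks per month, as in the examples below (the function and parameter names are illustrative):

def weeks_to_page_two(position: int, months_elapsed: float, domain_authority: float) -> float:
    """Formula 1: estimated weeks to push a negative result from `position` to page 2."""
    age_term = months_elapsed * (months_elapsed + months_elapsed / 10)
    rank_term = (13 - position) * (1 + domain_authority / 100)
    return age_term + rank_term

def service_budget(weeks: float, monthly_fee: float, weeks_per_month: int = 4) -> float:
    """Convert the week estimate into a budget at the provider's monthly fee."""
    return (weeks / weeks_per_month) * monthly_fee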

Example: NAMECO hires reputation company XYZ and tells them it has 2 negative search results it would like suppressed to the 2nd page of Google for its company-name search “NAMECO”: one in position 10 and the other in position 5. XYZ does its research and finds that the result in position 5 was published 3 months ago and that the site has a Domain Authority of 80 (quite powerful). XYZ accepts the project and is asked to estimate the time it would take to suppress the more prominent result, in position 5, to page 2. Assume XYZ charges $2,000 a month for its services.

XYZ would then use the formula:

3 X (3 + 3/10) + (13 – 5) X (1 + 80/100) = 9.9 + 14.4 = 24.3 weeks, so between 6 and 7 months of service. In other words, the service budget should be between $12,000 and $14,000.

An insurance company could decide to insure against such an outlay (damage). It would ask the company to certify, at the time of the insurance application, that the query name is not generating any prominent negative results within the first few pages of the SERP, and stipulate that a negative result must be reported within 6 months of being published in order to qualify for coverage of the expenditure. If the prominent negative result were reported only after being 7 months old, in this example, the timeline and difficulty would increase significantly, and the project would become correspondingly more expensive. Using the same example (all other things being equal):

7 X (7 + 7/10) + (13 – 5) X (1 + 80/100) = 53.9 + 14.4 = 68.3 weeks, so between 17 and 18 months of service. In other words, the service budget should be between $34,000 and $36,000.
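Using the sketch functions above, the two scenarios can be reproduced directly (figures as in the worked examples):

# Reported at 3 months vs. reported at 7 months, same result (position 5, DA 80).
for months in (3, 7):
    weeks = weeks_to_page_two(position=5, months_elapsed=months, domain_authority=80)
    budget = service_budget(weeks, monthly_fee=2000)
    print(f"Reported at {months} months: {weeks:.1f} weeks, ~${budget:,.0f} budget")

# Reported at 3 months: 24.3 weeks, ~$12,150 budget
# Reported at 7 months: 68.3 weeks, ~$34,150 budget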

The insurer can limit its liability by making sure the victim of a search reputation problem files a claim without unreasonable delay.