1. User-Oriented Concept
The subject of unnatural links generates a lot of confusion among website owners, webmasters, and even SEO professionals. I can tell you that unnatural links are a hard concept to grasp for the majority of people. You can easily “feel” that confusion if you read the Google Product Forums, where a lot of people discuss their unnatural link warnings and Google Penguin penalties.
Other techniques classify links based on a certain “toxicity” level or “potential” risk. We believe these techniques create a poor user experience and only add complexity to an already complex formula.
The route we took was to simplify the process of understanding and disavowing unnatural links.
We developed the system so that it splits the links into:
- Unnatural Links.
- Suspect Links.
- OK Links.
As simple as that!
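To make the idea concrete, here is a minimal Python sketch of that three-bucket output; the enum and helper below are hypothetical names used for illustration, not the product’s actual code:

```python
from enum import Enum

class LinkVerdict(Enum):
    """The three buckets every analyzed link is sorted into."""
    UNNATURAL = "unnatural"  # should be disavowed
    SUSPECT = "suspect"      # needs a quick manual review
    OK = "ok"                # natural, leave it alone

def summarize(verdicts):
    """Count how many links landed in each bucket."""
    counts = {v: 0 for v in LinkVerdict}
    for verdict in verdicts:
        counts[verdict] += 1
    return counts

# Example report over five classified links.
report = summarize([LinkVerdict.OK, LinkVerdict.OK, LinkVerdict.SUSPECT,
                    LinkVerdict.UNNATURAL, LinkVerdict.OK])
for bucket, count in report.items():
    print(f"{bucket.value}: {count}")  # unnatural: 1, suspect: 1, ok: 3
```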
2. False Positive Ratio
It is important to have a really low incorrect detection ratio. To put it simply, you would not want a system that detects only 10% of your unnatural links, or one that misclassifies good links as unnatural.
This was hard to achieve. No automatic detection system provides 100% certainty (Google misclassifies links too … it is all about the final false positive ratio).
We took the performance up to 97% correct classifications, with a false positive ratio of only 3% on our testing dataset.
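As a quick illustration of what these figures mean, here is how accuracy and the false positive ratio are computed from a confusion matrix; the counts below are invented so that the arithmetic lands on the numbers reported above, they are not our test data:

```python
# Hypothetical confusion matrix for a binary natural/unnatural classifier.
true_positives = 970    # unnatural links correctly flagged
false_negatives = 30    # unnatural links the system missed
true_negatives = 970    # natural links correctly left alone
false_positives = 30    # natural links wrongly flagged as unnatural

total = true_positives + false_negatives + true_negatives + false_positives

# Share of all links that were classified correctly.
accuracy = (true_positives + true_negatives) / total

# Share of the natural links that were wrongly flagged.
false_positive_ratio = false_positives / (false_positives + true_negatives)

print(f"accuracy: {accuracy:.0%}")                          # accuracy: 97%
print(f"false positive ratio: {false_positive_ratio:.0%}")  # false positive ratio: 3%
```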
3. Incorrect Metrics
An important factor in an accurate classification is the set of metrics used to draw valid conclusions from it.
For example, using external metrics such as Google PR or the indexation status of a link in Google is a flawed way of identifying an unnatural link. You can only identify such a link after Google has potentially marked it as unnatural, which means you are treating Google as the main signal of an unnatural link. An automatic system built on such metrics relies on links that have already been flagged, so it might only work for sites that have already been penalized.
We do not use any external metrics in our algorithm to detect unnatural links. This made the development process harder, but the end result is more accurate and trustworthy.
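As a rough sketch of what an internal-metrics-only approach can look like (every feature name below is an illustrative assumption, not our actual feature set), all signals are computed from the link and the linking page themselves:

```python
import re

def extract_features(anchor_text, page_text, outbound_link_count):
    """Build a feature vector using only signals computed from the link
    and the linking page itself: no Google PR, no indexation status,
    no other external metrics. All features here are hypothetical
    examples, not the product's actual feature set."""
    words = re.findall(r"\w+", page_text.lower())
    commercial_terms = {"cheap", "buy", "discount", "best"}  # hypothetical list
    anchor_words = anchor_text.lower().split()
    return {
        "anchor_word_count": len(anchor_words),
        # Money-keyword anchors are a classic paid-link pattern.
        "anchor_is_commercial": any(w in commercial_terms for w in anchor_words),
        # Thin pages stuffed with outbound links suggest a link farm.
        "words_per_outbound_link": len(words) / max(outbound_link_count, 1),
        "outbound_link_count": outbound_link_count,
    }

print(extract_features("buy cheap shoes online", "short spammy page text", 250))
```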
4. Detection Algorithm
I am not going to share the algorithm used to classify links as natural or unnatural, but I can tell you that it does not use external metrics; it relies on AI and on in-depth content and link profile analysis to separate the so-called “toxic” links from the natural ones. The rule set we use is based on the official Google Guidelines.
Let me give you a quick example with a web-directory link. In the context of a natural-looking link profile, that web-directory link will not be flagged as unnatural, because it simply is not. The same link placed in an unnatural link profile will be looked at from a different point of view and will be flagged as unnatural, due to the high number of unnatural link patterns found in that suspect profile.
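Here is a minimal sketch of that context-dependent idea, assuming an invented threshold and a pre-computed share of unnatural patterns in the profile; neither is the actual rule set:

```python
def classify_in_context(link_matches_suspect_pattern, profile_unnatural_share,
                        profile_threshold=0.5):
    """Classify a single link relative to its whole link profile.
    A pattern that is harmless in a clean profile (e.g. one
    web-directory link) gets flagged when the surrounding profile is
    dominated by unnatural patterns. Threshold values are illustrative."""
    if not link_matches_suspect_pattern:
        return "OK"
    # The same suspect pattern is judged by the company it keeps.
    if profile_unnatural_share >= profile_threshold:
        return "Unnatural"
    return "Suspect" if profile_unnatural_share > 0.2 else "OK"

# A web-directory link inside a mostly clean profile stays OK...
print(classify_in_context(True, profile_unnatural_share=0.05))  # OK
# ...but the identical link inside a spammy profile gets flagged.
print(classify_in_context(True, profile_unnatural_share=0.7))   # Unnatural
```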
You can read more on the official blog here.