We often hear about getting high quality links to our websites. It’s the same ol’ rumor that’s been passed on since the 12th century, from noobie to noobie. This trend, nay, this plague, is spreading exponentially across the infinite masses of self-proclaimed greenhorn experts to the ears of their initiated students.
Yet when you ask them where to get these links, they turn off the lights, shut the windows and lock the doors. Fear NOT, my web traveler! Read my recipe for getting high quality backlinks and feast on dominance of the SERPs.
Getting To The Point
Right, all the unnecessary bosh aside, let’s get to the point. With this method we’ll be getting highly relevant links from pages that are already written and can be found through the search engines. You can hardly get better quality links than these. However, for these links we’re gonna need to offer compensation.
This can be money, a link, free content or anything else the owner of the website might be interested in. You can also tell the webmaster that your link is gonna enhance their visitors’ experience, and they might give you a freebie. There are two ways to achieve our goal: manually or with the help of scripts.
Manually – The Snail Way
What we’re looking for here are contextual links. Go to Google and use the [intext:”your keyword”] command (without the [ & ]). This will display the pages Google thinks are most relevant. Now you’re gonna attempt to get a link from them. Go through the pages and extract all the URLs you think you can get a link on. Filter out pages like Wikipedia, About.com, CNN or similar sources.
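If you’ve already dumped your extracted URLs into a text file, the filtering step can be scripted. A minimal sketch, assuming a hypothetical exclusion list of big-brand domains you’d never get a link from:

```python
from urllib.parse import urlparse

# Hypothetical list of domains to skip - big authority sites
# that won't sell you a link anyway
EXCLUDED_DOMAINS = {"wikipedia.org", "about.com", "cnn.com"}

def filter_candidates(urls):
    """Keep only URLs whose domain is not on the excluded list."""
    kept = []
    for url in urls:
        host = urlparse(url).netloc.lower()
        # Match the registered domain, e.g. en.wikipedia.org -> wikipedia.org
        if not any(host == d or host.endswith("." + d) for d in EXCLUDED_DOMAINS):
            kept.append(url)
    return kept

urls = [
    "https://en.wikipedia.org/wiki/Link_building",
    "https://example-niche-blog.com/post",
    "https://www.cnn.com/article",
]
print(filter_candidates(urls))
# -> ['https://example-niche-blog.com/post']
```

Tweak the excluded list to taste; the point is just to avoid wasting outreach time on sites that will never answer.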
Next, select the URLs and check them for PageRank. To do this, go to the multiple PageRank checker at prtool.info. The higher the PageRank, the better. Now sort your URLs by PageRank and start contacting webmasters to give you a link in exchange for compensation. If you’re not competing for the exact same keywords, most webmasters will give you a link if your compensation is good.
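The sort itself is trivial once you’ve pasted the checker’s results back into pairs. A sketch with made-up example URLs and PageRank values (the lookup itself still happens in the bulk checker, not in this script):

```python
# Hypothetical (url, pagerank) pairs pasted back from a bulk PageRank checker
checked = [
    ("https://example-niche-blog.com/post", 3),
    ("https://old-forum.example.net/thread", 1),
    ("https://authority-site.example.org/guide", 5),
]

# Contact the highest-PageRank pages first
outreach_order = sorted(checked, key=lambda pair: pair[1], reverse=True)
for url, pr in outreach_order:
    print(f"PR{pr}  {url}")
```

This gives you your outreach list in priority order, top PageRank first.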
As you see there is some work involved in obtaining these high quality links, but there is an easier way.
Speedy Software Way
There’s a great piece of software out there called allsubmitter. It was actually developed for directory and similar form submissions, but it’s much more than that. I don’t remember how long I’ve been using it now, but I know my internet marketing endeavors would be much harder without this great tool. There are three major functions in this program that can automate our process of finding potential links. These are:
a) Internal PageRank Checker – Check up to around 2000 URLs for their corresponding PageRank with one click of a button
b) Inbuilt SERP Scraper – Type your query and extract URLs from all the major search engines
c) Inbuilt Position Checker – Find out where your website ranks for a given keyword in all major search engines
With all these functions you’ll be able to extract URLs and sort them by PageRank in a few seconds. Doing it manually would probably take you around 15–30 minutes, though I couldn’t say for sure – I always use allsubmitter for this.
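The pipeline those three functions automate can be sketched in a few lines. allsubmitter is closed-source, so `scrape_serps` and `check_pagerank` below are hypothetical stand-ins with canned data, not real APIs – they only show the shape of the workflow (scrape, rank, sort):

```python
def scrape_serps(query):
    """Stand-in for the SERP scraper: return candidate URLs for a query."""
    # Canned example data; a real scraper would query the search engines
    return ["https://site-a.example/page", "https://site-b.example/page"]

def check_pagerank(url):
    """Stand-in for the bulk PageRank checker: return a fake PR value."""
    fake_pr = {
        "https://site-a.example/page": 2,
        "https://site-b.example/page": 4,
    }
    return fake_pr[url]

def find_link_targets(keyword):
    """Scrape results for the intext query, then sort by PageRank, best first."""
    urls = scrape_serps(f'intext:"{keyword}"')
    return sorted(urls, key=check_pagerank, reverse=True)

print(find_link_targets("your keyword"))
# -> ['https://site-b.example/page', 'https://site-a.example/page']
```

Swap the two stubs for real scraping and checking and you’ve rebuilt the core of the speedy way.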
Why Is This Technique So Powerful?
In one sentence: because it lets you exploit the current Google algorithm without having to know how it works. You could also call the technique reverse engineering. Let me explain why. As we know, Google ranks pages according to relevancy – the least relevant at the end of the SERPs and the most relevant at the top.
With the intext command we’re gathering all the results Google thinks are most relevant. Those pages ALREADY rank at the top of Google for your keywords. And now you’ll be getting a link from them. All those highly relevant pages, they’re all linking to YOU. And that’s the secret power behind this technique.