Google Penguin Update Chaos part 2

This is a continuation of yesterday’s article on Penguin-proofing your SEO tactics.

Diversify Links

This one has hurt a lot of sites… There are three main aspects to link diversification: blog networks, link types and anchor text.

Blog Networks

Blog networks were groups of website owners who would link to each other’s blogs and articles, forming a giant link ring that boosted the search engine rankings of everyone involved. Most of these networks charged to get your content distributed on the network. They were extremely easy for Google to infiltrate, and Google devalued any links from these networks. Some website owners had built their entire SEO strategies around these networks, and they lost all of their hard work, money spent and rankings in one day. You need to diversify and keep your link presence natural!

Link Types

A link is a link, but not all pages are created equal. A quick scan of link providers on Fiverr shows sellers offering .EDU links, .GOV links, wiki site links, article links, press releases, bookmarks, directories, blogrolls, forum profiles and more. Which of these are the smart ones to use? All of them… in moderation.

Google isn’t perfect but they’re also not dumb. If your website has 100 links today and next week jumps to 1,000, and all 1,000 are from .EDU blogs, then that might look suspicious. There are a LOT of people at this point who will point out that they’ve done this on numerous sites and are still fine, that the internet is way too big for Google to pay attention to stuff like this, etc. My counter argument is “why risk it?”. If you knew that Google had a short list of websites that they manually analyzed once a month for suspicious activity, and you knew that you were on it, how would you adjust your link building?

Now ask yourself if it’s worth the risk to be more aggressive than that. If you have 20 sites that are just numbers to you and you don’t mind risking them, be my guest. If however you only have a few sites (or just one) that you really care about, caution is probably the best policy.

This is not to say “stay away from automation”. You can automate smart methodology or you can automate dangerous methodology. Nobody wants to manually submit a site to 500 internet directories, but instead of automating all 500 in a day, why not do 5 a day for 100 days? I know Google won’t find them all at once anyway, but if we can make it look natural, let’s do it.
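As a rough sketch of that drip-feed idea, here’s how you might split a big submission list into small daily batches. The directory names and batch size are purely illustrative, not a real submission tool:

```python
# Sketch: spread a big batch of directory submissions over many days so
# the resulting links appear at a natural pace.

def drip_schedule(items, per_day):
    """Split items into daily batches of at most per_day each."""
    return [items[i:i + per_day] for i in range(0, len(items), per_day)]

directories = [f"directory-{n}.example.com" for n in range(500)]
batches = drip_schedule(directories, per_day=5)

print(len(batches))     # 100 days of work
print(len(batches[0]))  # 5 submissions on day one
```

Feed each day’s batch to whatever tool you use; the point is the pacing, not the tool.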

Anchor Text

Now that we’ve discussed where and when to put your links, let’s touch on what they should say. Google’s Penguin update looks as though it has placed a high value on anchor text diversity.

If you started a website called “Fred’s house of used boxes” and you want to rank for the term “used cardboard boxes”, then you probably use “used cardboard boxes” in all of your link building efforts. STOP!! If you get 100 natural links from other real life humans, the odds are 90%+ of them aren’t going to use your keywords as the link text, so you shouldn’t either. Over half of the links to your site are going to be your site’s name or bare URL; a lot are also going to be of the “click here” variety. Keep your links natural in amount and in appearance.
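One way to sanity-check whether your own profile looks natural is to tally what share of your backlinks uses each anchor text. A quick sketch, with completely hypothetical anchor data for Fred’s site:

```python
from collections import Counter

def anchor_profile(anchors):
    """Return each anchor text's share of the total backlink count."""
    counts = Counter(anchors)
    total = len(anchors)
    return {text: count / total for text, count in counts.items()}

# Hypothetical backlink anchors: mostly the site name, URL or
# "click here", with keyword anchors as a small slice.
anchors = (["used cardboard boxes"] * 8
           + ["fredsboxes.example.com"] * 55
           + ["click here"] * 25
           + ["Fred's house of used boxes"] * 12)

profile = anchor_profile(anchors)
print(round(profile["used cardboard boxes"], 2))  # 0.08
```

If your exact-match keyword anchor is more than a small slice of the pie, that’s the red flag this section is warning about.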

The final thing I want to touch on regarding Google Penguin is “above the fold” content. The phrase “above the fold” refers to the content that is visible on a normal sized computer monitor without scrolling down. Content at the top of your page is assumed to be more important than content many screens below it that most people will never reach in standard web browsing.

There have been several instances where Google Penguin has appeared to punish sites with little “above the fold” content. This can be due to excessive ads, a large “call to action” form like “enter your email here”, etc.

Google assumes that sites that do this don’t care as much about providing content to the user as sites that don’t use these tactics, so adjust accordingly.

Most site owners weren’t affected by the Google Penguin update but it’s still important to keep up to date on Google’s opinions on matters to make sure that what you do today doesn’t sink your rankings after a future update.

Posted in Uncategorized | Leave a comment

Google Penguin Update Chaos part 1

The SEO world has been abuzz the past few weeks over Google’s latest
algorithm update which they have code named “Penguin”. It is important
to note that unlike some other Google updates, this is not a penalty,
but rather a modification to their algorithm. If your site dropped
recently, it wasn’t hit with a manual penalty; you just need to fix
whatever it is on your site that the new algorithm doesn’t like.

Google has been rather vague as usual about what the new algorithm
doesn’t like. They use generic terms like “over optimization” and show
examples which are beyond blatant like sites with literally hundreds
of keywords stuffed into the meta tags.

The best way to find out what types of sites are having problems is
ironically to monitor Google’s own forums to see which website owners
are complaining. Owners often post their sites and how bad of a
ranking drop they’ve experienced. By looking at enough of these posts,
you can get a good feel for what’s being targeted.

Even if your site hasn’t suffered a drop in the rankings it’s probably
a good time to do a small audit of your SEO practices to make sure
that you’re not performing methods that were considered legitimate a
year or two ago but have now drawn the wrath of Google.

One other important thing to note is that like other Google updates,
this is no doubt a work in progress. If you read SEO forums you’ve no
doubt seen examples people have posted of sites with virtually no
worthwhile content which have suddenly leapt into the top ten rankings
of valuable terms due to several of the old top ten inhabitants having
their rankings plummet due to the new algorithm. I have no doubt that
these rankings will be short lived and the cream will generally rise
back to the top. Google’s SEO motto has always been “if you have a
site with great content that people love, things are changing in your
favor”. That remains as true now as it ever has, as long as you don’t
do anything to piss Google off.

Target 1: Over Optimization

The term “over optimization” is scary. I’ve got a WordPress plugin designed to help me optimize my posts or pages when I’m trying to target a specific term. How the heck am I supposed to know when my optimizing crosses the line into over optimizing?
My plan to keep on Google’s good side is as follows:

Limit each post/page to 2 meta keywords

I know it’s tempting to go for 6-7 but it’s probably not worth it. It’s a field that you don’t want to leave blank but pick your best two terms and let the others go.

All but ignore keyword density

I’ve never been a big fan of the “you need 4% of your post to be keywords” crowd, but now it’s more than a matter of choice. If you write quality content and afterwards realize you only used your main keyword phrase two or three times, don’t worry about it. Keeping your content natural has crossed the line from being smart to being a necessity. If you’ve got a spot to use your keywords an extra time, go for it, but don’t force in extra uses just to bump your keyword density.

I’ll discuss more Penguin-proofing tactics in tomorrow’s post.

Posted in Uncategorized | 1 Comment

The Importance of LSI Keywords in SEO

If you’ve never heard of the term LSI keywords, you’re not alone. LSI stands for Latent Semantic Indexing, which does nothing to clarify anybody’s understanding of the topic. Without getting into pages of technical jargon, LSI attempts to identify relationships between terms and concepts in text. Search engines use LSI, so it’s important to understand how it works.

With the Google Panda updates and the Penguin update, the SEO world is in a state of upheaval. Some sites with plenty of quality content are dropping in the rankings while a few garbage sites with very little quality content are in the top ten for some competitive terms. I’m going to write more about current industry theories and what the Google Penguin update is targeting tomorrow.

LSIs are basically related keywords, and with recent updates they’ve become more important than ever. If I’m creating content that I hope will rank for the term “travel through Europe”, it’s fine to have that term in the title and to use it a time or two in the main body, but Google is likely also checking for the presence of other terms to see if my article is REALLY about travel through Europe like my title claims it is.

What terms could it be checking for? How about “backpacking”, “trains”, “euro rail” and “bus”? The presence of terms like these would help indicate to Google that your article was indeed about travel in Europe. LSI usually isn’t an issue if you’re writing high quality content, because your article is likely to naturally use keywords related to your main topic. Where people run into problems is when they use poor quality content (scraped from low quality sources like PLR articles, poorly spun content, etc.) or when they try to keyword stuff and use their main term an unnatural number of times in lieu of related terms.
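As a toy illustration of the idea (nothing like Google’s actual algorithm), here’s a naive substring check for whether related terms show up in an article:

```python
def lsi_coverage(text, related_terms):
    """Return the related (LSI) terms that appear in the text.

    Naive substring matching -- good enough for a quick sanity check,
    not what a search engine actually does.
    """
    lowered = text.lower()
    return [term for term in related_terms if term.lower() in lowered]

article = ("Backpacking across Europe is easiest on trains: a Euro rail "
           "pass covers most countries, and a bus fills in the gaps.")
terms = ["backpacking", "trains", "euro rail", "bus"]

print(lsi_coverage(article, terms))  # all four related terms are present
```

If quality content comes back with almost no related terms present, that’s a hint the article may not really be about what its title claims.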

If you’re writing an article and would like to view a few LSI keyword options there are a few ways to do it. The way most people find them is by going straight to the source and using Google’s own AdWords keywords tool. It’s 100% free to use and provides good results.

If you’re using WordPress, there are several SEO plugins which will enable you to view LSI keywords while you’re writing your page or post. I use a plugin called WP Easy SEO and I love it.

The moral of the LSI story is that, with the SEO landscape in such flux, using quality, original content is more important than ever. Be very careful not to “over optimize” your keywords in your posts; instead, use natural synonyms like those suggested by the Google keywords tool.

Posted in Do it Yourself SEO Tips | Leave a comment

Using Footprints for SEO 101

If you’re the type of person who realizes the importance of having diverse backlinks to your site and therefore often finds yourself searching for spots to acquire these links, then this post may be extremely relevant to your interests.

Let’s pretend that you’re a science fiction author looking for blogs and forums to post relevant comments on. You can Google science fiction related topics all day and come up with some good spots to post but this is an extremely time consuming process. Most of the sites you visit won’t have comment functionality and many others will have topics older than a set time frame closed for comments. What you need to dramatically improve your efficiency at finding spots to post in is the footprint.

A footprint is merely an indicator that a certain feature exists on a website. For instance, most WordPress blogs have “powered by WordPress” in the bottom footer. Therefore, if you wanted to find WordPress blogs about your topic to comment on, you could search for “powered by WordPress” along with your target topic in Google, and your results would be almost exclusively WordPress blogs. This is an extremely basic footprint, but it’s a good way to demonstrate the concept.

I’m going to show you an example of an “advanced” Google search, then dissect it piece by piece so you can modify it to fit your needs. The following line is what you would type into Google:

inurl:blog "post a comment" "2012" -"comments closed" science fiction books

The topic being Googled for is obviously science fiction books but now let’s take a closer look at all of the modifiers which I placed before my target term.

The first modifier is site:. This limits your search to specific sites and eliminates all others. This isn’t something that you usually need, but it can be handy to hunt for specific domain types (.edu or .gov) or to target specific countries (.uk).

The next modifier is inurl:blog. This limits your searches to sites which have the word blog somewhere in the url. Once again this is not a mandatory option but it will let you get creative.

The next modifier is “post a comment”. This means you will only see results where the phrase “post a comment” appears on the page. This is obviously a HUGE help when you’re looking for places to comment.

The next modifier is the simplest. “2012” just limits your search to pages which contain the text “2012”. If all you’re after is a link, then you may not care how fresh the page is, but if you’re trying to find fresh posts to comment on and draw visitors, then you want to look for recent posts. I’ve also added the current month to my queries before to limit my search results even further.

The next modifier is -“comments closed”. Notice there is no space between the minus sign and the opening quotation mark. This will eliminate any page which contains the text “comments closed”. This is another big help when your goal is to leave comments.

The phrase science fiction books is the meat and potatoes of your query. This is where you would put the terms you’re actually looking for.

The example I’ve just shown is not some magical search query, just some features which will let you create some very specific searches. Make a few searches, find a couple that yield good results for your niche, save them as bookmarks and use them whenever you want to get an extra link or two.
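If you build these queries often, it can help to assemble them programmatically. This is a small sketch of my own; the helper name is made up, and it just strings the modifiers above together:

```python
def footprint_query(topic, must_contain=(), exclude=(), inurl=None):
    """Assemble a Google footprint search string from a set of modifiers."""
    parts = []
    if inurl:
        parts.append(f"inurl:{inurl}")
    parts += [f'"{phrase}"' for phrase in must_contain]   # must appear on page
    parts += [f'-"{phrase}"' for phrase in exclude]       # must NOT appear
    parts.append(topic)
    return " ".join(parts)

query = footprint_query(
    "science fiction books",
    must_contain=("post a comment", "2012"),
    exclude=("comments closed",),
    inurl="blog",
)
print(query)
# inurl:blog "post a comment" "2012" -"comments closed" science fiction books
```

Swap in your own topic and phrases and you can generate a whole family of footprint searches in seconds.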

I think you’ll find using footprints to be a huge time saver and, most importantly, that they convert the chore of building a few links to your site into a task which can be completed in a few minutes. SEO is a time consuming process, and anything we can do to streamline tasks like this is invaluable.

Posted in Do it Yourself SEO Tips | 47 Comments

What the Heck is a Hit? Or, Why Don’t my Google Analytics and Hostgator Match?

I wrote a post last week about guarantees in the SEO world. I have no problem with SEO providers who offer guarantees because I get why they do it. If I’m going after a client but they end up picking SEO firm X because X offers a guarantee and I don’t, I will be offering a guarantee from there on out. What I do have a problem with is SEO providers misrepresenting the value of getting a top ranking in Google for a term with no traffic. Deceiving the client in any field just isn’t professional. The reason I bring this up is that I’m going to talk about another topic today that gets its fair share of sleazy SEO providers exploiting it: the mysterious website “hit”.

Last week I was visiting a business forum I heard mentioned in a podcast when I saw a post where someone had asked why the amount of web traffic that Hostgator said they got was three times the amount that Google Analytics said. I knew exactly what the answer was so I jumped in and explained that Google Analytics will tell you the number of visits you had, the number of unique visitors you had and the number of page views you had. All of these are useful numbers to know. The number that is usually pulled from Hostgator or other web hosts is the “hit”. You would think that a hit would be the same as the number of page views but it is usually far, far higher. A “hit” from a web host is a count of files the server had to load for your site.

When Sally clicks a page on my wife’s site, Google Analytics counts that as a page view. Hostgator counts the text and basic framework of the page as a hit, the Amazon banner down the right side as a hit, the ad for her graphic designer as a hit, and so on. One page view could be three dozen hits depending on how many elements your page has. If you would like to see proof, here is a Google Analytics screen shot from my wife’s site for March 25th, followed by a screen shot of our Hostgator stats for March.

From Google Analytics For March 25, 2012:

Google Analytics Screen Shot

From Hostgator For All of March, 2012:

Hostgator Screen Shot

You can see 2,282 page views in Google with 37,615 “hits” according to Hostgator. Neither of these numbers is wrong; they are just completely different metrics. That said, why would anyone in the modern era of analytics options still use “hits”? The other numbers in Hostgator aren’t much better, as they too are inflated by iframes and other page-within-a-page elements. Some people don’t understand the difference and there’s nothing wrong with that. All of this stuff is a learning experience for all of us. What I have unfortunately witnessed a few times are people who are being deceived by those handling their websites.
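The hits-to-page-views relationship is simple arithmetic: every page view loads the HTML plus each embedded file, and the server counts each file loaded as one “hit”. A quick sketch with made-up numbers:

```python
def estimated_hits(page_views, assets_per_page):
    """Each view loads the HTML plus every embedded file (images, ads,
    scripts...), and the server counts each file loaded as one 'hit'."""
    return page_views * (1 + assets_per_page)

# A page with 35 embedded elements turns 100 page views into 3,600 hits.
print(estimated_hits(100, 35))  # 3600
```

That’s how a modest site racks up tens of thousands of “hits” a month without tens of thousands of readers.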

I’ve seen several instances where people whose websites have no Alexa ranking, or a ranking in the 6-8 million range, brag about their website getting hundreds of visitors a day. I KNOW the smart thing to do is to nod my head and keep my mouth shut, but I genuinely try to help people, so I will try to explain that their website’s Alexa ranking doesn’t indicate that level of traffic and that they should check out Google Analytics to get great information about their site. Multiple times, my reward for trying to help someone get a good grasp of their website’s activity has been a defensive reply and assurance that they know the numbers are accurate because their “web guy has shown them proof”. That is the point where I keep my mouth shut for good.

If you’ve ever wondered exactly what a hit is, now you know. Now you can wonder why anybody would use hits as a metric, unless the purpose is to make a site appear far more active than it is.

Posted in Uncategorized | 3 Comments

Dragon Naturally Speaking 11.5 Review

When I decided to start an SEO blog I told myself to commit to doing one post a day. So far so good on my SEO version of a New Year’s resolution. I’ve had some dumb ideas in my life, and the latest on the list was thinking that Dragon Naturally Speaking would help me “write” more articles and reach my goals.

For those who aren’t familiar with Dragon Naturally Speaking (DNS), it’s a program designed to let you speak into a microphone while it translates what you’re saying into text. I thought that since I can talk faster than I can type, I would give it a try. I did my due diligence and read a lot of reviews on Amazon and a few other sites. It seemed like 75-80% of the reviewers thought it was an awesome product and a big help. The remaining 20-25% said it either didn’t work well for them or that it was really buggy. Believe it or not, both sides were right, except for the “big help” part.

It is an incredible piece of software. I spent some time reading the practice articles while it learned my voice. Once that was done, I started describing what was going on in the baseball game on TV, and it was doing a really, REALLY good job of translating. It would definitely take some checking and tweaking before you could post the text it produced, but it was still impressive.

The problem came when I realized that I was speaking very slowly and very clearly. When I started to speed up my speech a little bit the accuracy went down in a hurry. I couldn’t come anywhere close to normal speech speed before it started looking like my cat jumped on my keyboard.

I was starting to wonder if the product would be the miracle cure I thought it would be, but I figured I would try it out a little longer. That’s when I started to see signs of the other complaints: the freezing up. The program would be working fine, but the second you tried to do anything else on the machine, it would freeze. I’m not using a supercomputer, but my laptop is no lightweight by any means. After I had to restart my computer two or three times, I uninstalled it and haven’t touched it since.

I really wanted to like this product and incorporate it into my article writing routine but I just couldn’t do it. I guess the 75-80% of users who love it have adjusted to speaking slowly and clearly so it works for them but if I have to do that, I’ll just go ahead and type it.

Posted in SEO Software Reviews | 2 Comments

Using Proxies for SEO and Internet Marketing 101 (Part 2)

Yesterday I talked about how my journey down the path of SEO enlightenment led me to the use of proxies, and today I’m going to talk about the different types of proxies available and when and why you would use them.

In my opinion there are three categories of proxies:

  • Public Proxies
  • Low Grade Private Proxies
  • High Grade Private Proxies

Public Proxies

Public proxies are the easiest of these to get, and they can be 100% free. There’s software out there (I use one called proxy harvester) that lets you push a button, go do whatever you want for half an hour, and there will be 300 or so proxies waiting for you when you get back. These are HORRIBLE quality proxies that a ton of other people will be using as well. Later that night only 220 or so of the proxies will still be working; the next day the number will be in the mid-100s, and so on. When you’re using these you need to run a fresh list daily.

If you’re wondering what (besides the complete lack of reliability) makes these proxies so low in quality, it’s that tons of other people are probably using the same proxy and you have no idea what horrible, spam filled tasks they’re using it for. If you tried to use these proxies for anything tied to a specific account (like Twitter), that account probably wouldn’t last long. These proxies are only good for a few things, like scraping the internet for footprints or other automated searching, but don’t try to post any content through them.
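Since public proxy lists go stale so fast, you could filter a scraped list down to the live ones before each run. This is a rough sketch, not a hardened tool; the test URL, timeout and proxy addresses are arbitrary choices of mine:

```python
import urllib.request

def probe(proxy, timeout=5):
    """True if a single HTTP request through the proxy succeeds."""
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": proxy}))
    try:
        opener.open("http://example.com", timeout=timeout)
        return True
    except Exception:
        return False

def filter_live(proxies, check=probe):
    """Keep only the proxies that pass the liveness check."""
    return [p for p in proxies if check(p)]

# Demonstrated with a stubbed check so no network is needed:
scraped = ["10.0.0.1:8080", "10.0.0.2:8080", "10.0.0.3:8080"]
print(filter_live(scraped, check=lambda p: p.endswith("2:8080")))
# ['10.0.0.2:8080']
```

Run something like this daily and you at least start each session with proxies that answered once, even if they die an hour later.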

You’ll see a lot of services advertising that they’ll send you free daily lists of clean proxies. I’ve never tried any of them because fresh proxies are extremely quick and easy to fetch yourself.

Low Grade Private Proxies

This category isn’t nearly as cut and dried as public proxies, but it does exist. No provider will say they offer “low grade” private proxies, just “private proxies”. In fact, they will all claim to be high quality. How do I tell the difference? By the reputation of the provider.

The bad news is that private proxy services are tough to judge. The good news is that it usually doesn’t matter. I have ten private proxies through what I’m sure is a low grade private proxy provider that I found on some forums. The provider has a ton of positive feedback and cheap prices (under $11 a month for 10 proxies). The proxies have already changed once in the last month which is a good thing for some users but a bit of an inconvenience for me. I haven’t used this service long enough to recommend them but if I’m still pleased with it in a few months I’ll do a post on them.

I bought these ten proxies to help speed up my SEO rank checking and research efforts and they work great. I trust nobody else is using these at the moment and I feel comfortable with them running queries that are tied in some cases to my Google accounts (keyword research). People that use automated posting tools say they use private proxies because the success rate for posts or directory submissions is far higher with private proxies than with public ones.

I saw a study a few weeks ago where a user submitted some sites to 66 different internet directories multiple times. One batch was with public proxies scraped from the internet, one with private proxies and one with no proxy at all, just his natural ip address. After 45 days about half of the private proxies and no proxy links were alive and well while NONE of the public proxy links survived. This isn’t why I use private proxies but it’s the reason most people do.

High Grade Private Proxies

A private proxy is supposed to guarantee that, for the period you rent it, nobody but you will be using that ip address. What providers oftentimes can’t guarantee is what the last person who used that ip address did with it. There are a few proxy services out there that really strive to provide a high quality product, and luckily for me my first (and second) proxy purchase was from a quality provider. I’ve been using them for almost a year now and have had nothing but great results. When I bought my first proxy from them I got an email asking what I was going to use it for. I thought that was really weird, but I told them it was to check my search engine rankings and they quickly set up my proxy.

Down the line I learned that the reason they asked that question is that certain blocks of ip addresses get burned for certain uses, especially Twitter. If I had said that I was going to use the ip address for tweet adder, then they would have made certain to give me an ip address in a block that is Twitter friendly. That kind of attention to detail is not common in the proxy world.

The only twitter account I have is the one I started last week for this website but a lot of people have a lot of twitter accounts for different websites, businesses etc. The danger is that if twitter gets upset enough to ban one of your twitter accounts they often ban all of them. Depending on your business model this can be a devastating blow. One safety precaution that some use is to keep their twitter accounts separate from each other including using a different ip address for each account. A lot of users say that it’s safe to have up to five twitter accounts from an ip address but with proxies being as cheap as they are, it’s probably better to be safe if building multiple twitter accounts is going to be a big part of your strategy.

If you’re going to buy proxies for use with Twitter, stick to companies that have been around for a long time, have a long history of great reviews and say that they are safe for Twitter. Make sure you spell out ahead of time exactly how you plan to use the proxies, and place the priority on protecting your Twitter assets rather than trying to save a dollar or two a month.

Posted in Do it Yourself SEO Tips, Uncategorized | Leave a comment

Using Proxies for SEO and Internet Marketing 101 (Part 1)

Proxies are basically little digital gateways that allow internet traffic to pass through them; servers can be set up to play the role of a middleman between you and your intended internet destination. Using proxies can provide you with some anonymity, and lots of people use proxies for a lot of different reasons. Some countries ban certain websites from their version of the internet, and people in those countries can use proxies to bypass those restrictions. Some download sites may only allow one download an hour from your computer, so a few users switch proxies after every download to get around this limitation.

95% of internet users have no need for proxies and therefore either don’t know about them or never think about them. Two years ago I never used proxies but now that I’ve been involved with SEO work I pay for 15 private proxies each month and use countless other “disposable” proxies. The crazy part is that’s chump change in the SEO world where proxies are often bought and sold by the hundred. If you’re wondering what the heck I’m up to that I need all of these proxies, I’ll explain.

The first piece of SEO software I bought was SEO PowerSuite. I still use the Rank Tracker program every day to track my websites’ rankings in search engines. For my wife’s main website I track her position for 92 terms in 3 different search engines every day. When I first started using Rank Tracker, a full report took around an hour. The downside was that oftentimes when my wife or I would try to use Google later in the day, it made us enter a captcha for every search we wanted to do. Google frowns upon the use of automated tools to do keyword research, and all of the activity coming from our ip address was making them place our ip on a short term (a few hours) “slow down” list.

After a few days of that I ponied up a whopping $2.50 a month and bought my first private proxy. That could possibly be the first time in history that someone bought one proxy by itself but they were willing to sell it so I bought it. The proxy didn’t affect the speed of my rank checking one bit and it saved me from upsetting Google from the ip address that we used for our day to day browsing.

A few months after that, several of the search engines started changing things to make life more difficult for those using automated tools. SEO PowerSuite did a great job adjusting to the changes like they always do, but the changes did make the process quite a bit slower. What used to take one hour would now take several hours. And that was just for one site! The answer was to either grow more patient (not going to happen) or buy more proxies. I upgraded my private proxy service from 1 proxy to 5 and loaded them into Rank Tracker. Now instead of checking one ranking at a time per search engine I could check 5 at a time per search engine, and for all intents and purposes it looks like the requests are coming from five different locations. This took the process back down under an hour and all was right in the world.

A few weeks ago I was doing a large amount of keyword research for a friend and my queries were taking quite a while to run so I grabbed ten more private proxies from a different source. I put those into rank tracker and raised the number of processes from 5 to 15. What was taking an hour now consistently takes 8 minutes. Now I’m starting to see why people make such a big deal about proxies 🙂
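The speedup comes from running queries in parallel, one worker per proxy. Here’s a minimal sketch of that round-robin fan-out; the rank-check function itself is stubbed out, since the real tools handle that part:

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import cycle

def run_checks(queries, proxies, check):
    """Fan queries out across proxies round-robin, one worker per proxy."""
    pairs = list(zip(queries, cycle(proxies)))
    with ThreadPoolExecutor(max_workers=len(proxies)) as pool:
        return list(pool.map(lambda qp: check(*qp), pairs))

# Stub check that just records which proxy handled which query.
results = run_checks(["kw1", "kw2", "kw3"],
                     ["proxy-a", "proxy-b"],
                     check=lambda query, proxy: f"{query} via {proxy}")
print(results)  # ['kw1 via proxy-a', 'kw2 via proxy-b', 'kw3 via proxy-a']
```

With 15 proxies instead of 1, the work divides roughly 15 ways, which is why an hour-long job can drop to minutes.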

In my post tomorrow I’m going to talk about the different types of proxies and why you need to make sure you’re using the right proxies for the right job.

Posted in Do it Yourself SEO Tips | 6 Comments

How I Perform my Initial Website SEO Analysis for a Client

I just finished doing an initial website analysis for a small internet startup in a specific niche in the vintage fashion accessory market. While the process is fresh in my mind, I thought I would get it down “on paper” to share with you all.

This is a three part process where I:

  • Perform keyword research
  • Analyze the client’s site for errors and on-site optimization
  • Identify potential backlink sources

I asked the client for a link to his website, 5-10 keywords he would like to rank for in search engines and “any other pertinent information” that would help me with my research. This client gave me his website’s address and 6 keywords so I got exactly what I needed to start.

Phase One: Keyword Research

The first thing to do is to take the list of keywords that the client provides and make it much, much larger. You never know which keyword terms will be good ones, so it’s best to start with as large a list as possible. I usually start off with Google’s AdWords and autocomplete data. Autocomplete is the data Google uses when it looks at what you’ve typed into the search bar and guesses the rest of the words you want to type, based on the queries of other users who started with similar words. These and other data sources are obviously a huge help in identifying new keywords. There are a couple of other services, like Yandex, that provide similar research options.

This case was unique because the client’s keywords were so specific that they were under the radar for the usual methods. That is rare. I went to plan B, which was to sic Google on the client’s six terms and see what other keywords were being used by the top 10 ranking sites. I’m rather proud of this methodology 🙂 .

That search took a few minutes and resulted in a list of around 500 keyword options. I went through the 500 and picked around 45 that I thought were relevant to my client’s business. I then took those 45 plus the original six and analyzed them all for competition (the number of other sites fighting for the term) and for how many monthly searches each term got. I then take the number of searches, square it, and divide by the competition; that number is called the keyword efficiency index, or KEI. Sorting by KEI gives me a great look at which keywords will be good targets.
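The KEI calculation described above is easy to sketch. The candidate terms and their numbers below are invented for illustration; only the formula comes from the text:

```python
def kei(monthly_searches, competition):
    """Keyword efficiency index: searches squared over competing pages."""
    return monthly_searches ** 2 / competition

# Invented candidate terms: (term, monthly searches, competing pages).
candidates = [
    ("vintage hat boxes", 1600, 250_000),
    ("used cardboard boxes", 9900, 4_800_000),
    ("antique hat pins", 880, 60_000),
]

for term, searches, comp in sorted(
        candidates, key=lambda c: kei(c[1], c[2]), reverse=True):
    print(f"{term}: KEI {kei(searches, comp):.2f}")
```

Note how a term with modest search volume can outscore a big one if its competition is low enough; that’s exactly the kind of target this phase is hunting for.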

Phase Two: On-Site Analysis

Phase two is far more automated than phase one. I push a button to start a tool, type in some info, push another button, and after 10-30 minutes the job is done and I can generate the report. If this step were a tiny bit easier, I could outsource it to my bulldog.

The report gives basic info about the client's website, such as how long the domain has been registered, traffic rank, PageRank, etc., but it also checks for things like structural errors and whether search engines can crawl all of your pages. It can also identify which pages have no description or meta data, have duplicate descriptions or meta data (you want unique descriptions), and/or have duplicate titles or no titles at all.
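I'm not privy to the tool's internals, but one of the checks it runs, flagging pages with a missing title or meta description, is easy to sketch with Python's standard-library HTML parser. The class and sample page below are my own invention, not the tool's actual implementation:

```python
from html.parser import HTMLParser

# Sketch of a missing-title / missing-description check for a single page.
class PageAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = ""
        self.has_description = False
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.has_description = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit(html: str) -> dict:
    parser = PageAudit()
    parser.feed(html)
    return {
        "missing_title": not parser.title.strip(),
        "missing_description": not parser.has_description,
    }

# This invented sample page has a title but no meta description.
print(audit("<html><head><title>Widgets</title></head><body></body></html>"))
```

Run against every crawled page, the results roll up into exactly the kind of counts the report gives.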

For this client’s site, the report identified 27 pages with no title tag, 13 pages with duplicate titles (5 distinct titles) and 46 pages with no description. The good news was that there were no structural errors (i.e. broken links) and all of the pages could be crawled by search engines.

This is a GREAT report to give to my client, who can then convert it into a quick to-do list for whoever is responsible for his on-site optimization.

Phase Three: Identify Potential Backlink Sources

I looked at the final list of keywords and picked one with a great combination of high monthly searches and relatively low competition. When I typed that term into Google, positions 1 and 3 were occupied by larger chains with many categories, including my client's specific niche. Positions 2 and 4 were occupied by smaller specialty sites like my client's business. I grabbed both of those websites and got ready to do some competitor analysis.

I fired up another tool in my SEO arsenal and ran it against both targets. This tool checks several search engines and other data sources to find every link on the internet that points back to the target site. The tool then analyzes a ton of different factors like age of domain, number of inbound & outbound links, traffic estimates etc., in order to assign a weighted value to every link on that site. The higher that value, the more this tool is estimating that the link is worth to the competitor’s site. The algorithm to estimate value is constantly changing and does a great job.
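The tool's scoring algorithm is proprietary and, as noted, constantly changing, but the general idea of collapsing several factors into one weighted value can be sketched. Every weight and factor below is a made-up placeholder, not the tool's actual formula:

```python
# Purely hypothetical link-value weighting; all weights are invented.
def link_value(domain_age_years, inbound_links, outbound_links, traffic_estimate):
    # Older domains and more inbound links raise the score; pages that link
    # out heavily dilute the value they pass, so divide by outbound links.
    authority = domain_age_years * 0.5 + inbound_links ** 0.5
    dilution = max(outbound_links, 1)
    return (authority + traffic_estimate / 10_000) / dilution

# Invented backlink sources, scored and sorted highest-value first.
backlinks = [
    ("example-forum.com", link_value(12, 40_000, 85, 250_000)),
    ("tiny-blog.net", link_value(2, 300, 10, 4_000)),
]
for domain, score in sorted(backlinks, key=lambda x: x[1], reverse=True):
    print(f"{domain}: {score:.2f}")
```

The point isn't the particular weights; it's that once every backlink gets a single number, the whole list can be sorted and worked top-down.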

Now my client has a list of the links boosting the websites of two of his most successful competitors, sorted by the value of each link. The obvious strategy is for my client (or someone on his behalf) to start at the top of the list and try to obtain backlinks from the same places. Some will be as simple as a blog or forum comment, while others might require submission to a link directory. Some may require an email soliciting a link exchange, and quite a few won't be possible at all (e.g., a website that hasn't been updated since 2003).


The client now has a list of issues he should address on his site. He has a list of roughly 50 keyword terms and a great idea of which will give him the most results for his effort. Finally, he has a list of link targets that have proven successful for his competitors to use in his own link building efforts.

In my opinion, that is an ideal road map for building the foundation of success. Looking at the keywords and their potential, I believe the client will be extremely successful if he puts in the effort to capitalize on this information.

NOTE: This is my initial analysis of the website only. I also evaluate and make recommendations on other facets of my client's online presence, such as social media, mailing lists and other personalized issues.


Link Assistant Review and Overview – SEO PowerSuite

The fourth and final tool in the SEO PowerSuite family that I'll take a look at is Link Assistant. Of the four, it's far and away the tool I use the least. That's not to say it's a poor piece of software; I just don't deal with link exchanges much anymore.

Link Assistant has two main areas of functionality. Area one attempts to identify partners for you to exchange links with. The second area deals with managing those relationships and links once you’ve got them.

The ways that Link Assistant can try to find link exchange partners for you are:

  • Find sites by keyword search
  • Find sites with link submission forms
  • Find sites that link to your competitors
  • Find sites that already link to you
  • Harvest all URLs on webpages you specify
  • Perform a deep scan on the website you specify

When you pick any of these options and enter the required information, Link Assistant spends a few minutes searching Google and returns up to 30 possible link partners (with a PageRank of 3 or higher), along with each site's name, page title and, when it can find one, an email address. The fact that it searches both the website and its WHOIS record for contact info is a great little feature.

Out of the 30 results, 5-10 may be usable. You check the ones you want and click Next. The program then asks whether you want to exclude the ones you didn't pick from future searches; I always choose yes.

Once you've got some targets for a link exchange, you can use Link Assistant to send emails, automatically build link pages on your website pointing to all of your link partners, and monitor inbound links to make sure they still exist and are dofollow links. This is all really cool functionality; I just don't ever find myself using it.

I made a short YouTube video showing how I use Link Assistant.

Honestly, I wouldn't buy Link Assistant on its own, but the other three programs in SEO PowerSuite are absolute must-haves, so I just get the bundle with all four. I used this program a lot when I first got it to find link partners, and now that I've started this website I'll start using it again and report back here with my impressions.

As with all SEO PowerSuite tools, there is a free edition with no time limit and nearly full functionality that you can download from the site. I used the free edition for well over a month while I was deciding whether to purchase the product.

I want to make the same caveat about this piece of software that I made with Rank Tracker and SEO SpyGlass: if you're going to use it often, you should invest in a few private or shared proxies and a captcha service. This has nothing to do with the software itself; it's just the way search engines work. For my wife's main site I check three search engines for 92 different terms in Rank Tracker. With no proxies the search takes a few hours; with 5 private proxies I get the search time to just under one hour; and when I switched from 5 proxies to 15, the total time dropped to 8 minutes. Scans with this program can take a while.
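For a rough sense of why proxies matter, here's a naive back-of-the-envelope model, my own assumption rather than the tool's actual scheduler: spread the queries evenly across proxies, with a fixed per-proxy cooldown between requests to avoid captchas. The 60-second cooldown is a guess:

```python
# Naive model: queries split evenly across proxies, fixed per-proxy cooldown.
TERMS = 92
ENGINES = 3
DELAY_SECONDS = 60  # assumed per-proxy cooldown; the real value is unknown

def scan_minutes(proxies: int) -> float:
    queries = TERMS * ENGINES       # 276 queries in total
    per_proxy = queries / proxies   # queries each proxy must handle
    return per_proxy * DELAY_SECONDS / 60

for count in (5, 15):
    print(f"{count} proxies: ~{scan_minutes(count):.0f} minutes")
```

Under these assumptions, 5 proxies come out near the hour I actually see, while 15 proxies predict around 18 minutes; the 8 minutes I measured beats that, so the real-world gain is even better than a simple even split would suggest.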
