I just completed an SEO audit (an analysis of the points that make a site search engine friendly) of a directory and sent the owner and her webmaster a list of 20 things that need to be done to make it more search engine friendly. I tried to cover both on-page and off-page SEO points. This is a site I built about six years ago. I hadn't optimized it for about three years because it was doing very well for its keywords, and it only recently started to fall in the rankings, I suspect mainly because of link building on the part of her competition.
For the off-site SEO, which is mainly various methods of link building, I bought a new SEO tool (something I rarely do) that does a pretty good job of gathering data on the backlinks to her main competitor's website. Within the last couple of years, her main competitor basically copied the format of her site and her business model, then went out and contacted all her advertisers to get them to advertise on his site. Pretty slimy. He has also done a better job of link building over the last year or so, but that's about to change based on the information I gathered using the SEO Spyglass program. It looks through all the Yahoo backlinks and shows you the Google PageRank of each page linking to a competitor, so you can contact those sites and request links from the same pages. It also notes which of those links are rel=nofollow, so you can skip them. It's a pretty handy tool for that one task. I have about half a dozen SEO programs that I use for various features when doing these SEO audits; none of them does an adequate job all by itself, and the automated recommendations they make are often silly. (Such as: "Your top competitor has a keyword density of 7.2%. Your site only has a keyword density of 6.5%. You need to raise your keyword density by 0.7%." I pay little attention to keyword density; as long as the keywords are used in the right places on a page, keyword density has not proven to be important to good rankings.)
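To show the kind of filtering I mean, here is a minimal Python sketch of the same idea: given a list of backlink records, drop the nofollow links (they pass no PageRank) and sort the rest so the highest-PageRank pages get contacted first. The data and field names here are hypothetical, not SEO Spyglass's actual export format.

```python
# Hypothetical backlink records, roughly the facts the tool reports:
# the linking page's URL, its Google PageRank, and whether the link
# carries rel="nofollow".
backlinks = [
    {"page": "http://example-blog.net/review", "pagerank": 3, "nofollow": True},
    {"page": "http://example-links.org/travel", "pagerank": 4, "nofollow": False},
    {"page": "http://example-directory.com/resources", "pagerank": 5, "nofollow": False},
]

def link_prospects(records):
    """Keep only followed links and sort them by PageRank, highest first,
    to produce a prioritized outreach list."""
    followed = [r for r in records if not r["nofollow"]]
    return sorted(followed, key=lambda r: r["pagerank"], reverse=True)

for prospect in link_prospects(backlinks):
    print(prospect["pagerank"], prospect["page"])
```

The point of the sort is simply that a link request to a PageRank 5 page is worth more of your time than one to a PageRank 3 page.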
Among the list of action items that this directory needs to take are:
1. Channeling the PageRank through her site: there are about 30 category pages, and they all link to each other. I advised her to use rel=nofollow on many of those links so that the link juice flows down to her main category pages for her main keywords, instead of being spread all over the site.
2. Putting rel=nofollow on all links out to affiliate sites.
3. Fixing all Google crawl errors. There were a few.
4. Using actual category names in page URLs instead of query strings like cat=32, so the page names have a chance to help the page rank at Google for its keywords.
5. Setting up canonical link tags on all pages.
6. Setting up internal authority "hubs" for her main keywords, where page A links to page B using a main keyword as the anchor text, and page B links back to page A with the same anchor text. That often results in a double listing in Google for that keyword.
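To make points 2, 4, and 5 concrete, here is a small sketch of the markup a category page template might emit: a friendly URL built from a slug, a canonical tag, and a nofollowed affiliate link. The domain, slugs, and helper names are hypothetical illustrations, not the actual directory's code.

```python
def category_url(slug):
    # Point 4: a descriptive URL instead of a query string like ?cat=32,
    # so the keywords appear in the page name itself.
    return "http://example-directory.com/" + slug + "/"

def canonical_tag(slug):
    # Point 5: one canonical tag per page, pointing at the single
    # preferred URL, so duplicate URLs don't split the page's credit.
    return '<link rel="canonical" href="%s">' % category_url(slug)

def affiliate_link(url, anchor_text):
    # Point 2: affiliate links get rel=nofollow so they pass no PageRank
    # out of the site.
    return '<a href="%s" rel="nofollow">%s</a>' % (url, anchor_text)

print(canonical_tag("bed-and-breakfasts"))
print(affiliate_link("http://affiliate.example.com/offer", "Book a room"))
```

On a database-driven site, the slug would be stored alongside the numeric category ID so old cat=32 URLs can be 301-redirected to the new names.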
There were a lot more points, but I'm not going to list them all here. I use our proprietary checklist of about 60 points that we are certain can make a difference to a site's Google ranking in the natural search results.
If you have specific questions about how to optimize a large, database-driven website such as this directory, feel free to contact me.