
Content Trends

Content Marketing, SEO and Digital Marketing


How Your Content Is Hurting You, and What You Can Do about It

August 16, 2013 by Alan Eggleston

You are most likely SEO-savvy, using “white hat” SEO techniques and not losing rank to penalties for “black hat” tricks. But perhaps you know only a little about optimization – not enough to write content that keeps you in the top rankings.

Here are the 8 top reasons your content isn’t making it to the top, along with easy remedies for these common mistakes.

#1. Lack of relevancy for your keyword

One of the most important factors for determining search ranking is relevancy. Is your content relevant to your keywords?
Every page must have a unique set of meta data and links that create that sense of relevancy. Many websites lack both.

  • Meta data relevancy – is your content relevant to the title tag and the description tag you placed on the page? The closer the tags match the words used by the searcher, and the earlier you use them – in that same order – in the content on the page, the more relevant your page is.
  • Link relevancy – are your links relevant to your content and keywords? Anchor text was once also a key marker, but now a mix of keyword-rich and more general anchor text, linking to still-relevant content, matters more.
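As a sketch of the meta data the bullets above refer to, a page head carries a unique title tag and description tag that echo the page’s keyword early and in natural language (the keyword and site name here are invented for illustration):

```html
<head>
  <!-- Title tag: leads with the page's keyword, unique to this page -->
  <title>Content Marketing Tips | Example Site</title>
  <!-- Description tag: restates the keyword early, in natural language -->
  <meta name="description"
        content="Content marketing tips for writing pages that rank: keyword relevancy, link quality, and word count.">
</head>
```

Every page on the site would get its own title and description, not a copy of these.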

Remedy: Improve content quality with better research, more data, and more links. No one is coming to your site to watch you BS your way to a ranking. They want information. Make it your information, written in your voice, in your style and tone. Provide links to back up your material.

#2. Low keyword use

This probably isn’t as big a problem as keyword stuffing. However, if you use the keyword only once or twice, say at the top of the page, and then ignore it thereafter, you’re probably committing this sin. Don’t overuse keywords, but don’t underuse them, either. They help make the page relevant.
Remedy: Make your topic clear. Mention your keyword a few times for clarity.

#3. Irrelevant keyword use (keyword stuffing)

This is usually a “black hat” SEO trick – stuffing the page with keywords. But sometimes it’s inadvertent, and sometimes it comes from an overabundance of caution. One writer on Technorati suggests we shouldn’t use a keyword more than three times, but search engines would suggest you use it only often enough to serve the reader well – not the search engines. And if you have a high word count, it may make sense to use the keyword more often than if you have a low word count. Use common sense.

Another sense of “irrelevant keyword use” also comes in the form of trying to fit in all the different varieties of a keyword, just in case someone uses them. Search engines usually differentiate various forms of a word to account for it in a search. Forcing words unnaturally into your content sets off alarms at search engines.
Remedy:

  • Make better matches between keywords and meta data, and improve meta data placement
  • Use better quality links
  • Use a better variety of anchor text for links
  • Focus your keyword use

#4. Low quality content

Search engines insist, “Write for the reader, not for the search engines.” And what pleases the reader more than finding a treasure trove of information, written in a format that makes the data easy to pull out?

Poorly written, poorly spelled, poorly constructed content with little value is hard work for the reader and not at all pleasing to search engines. Google’s Panda update was created to weed out low-quality content, and its penalty will send your ranking south.
Remedy: Always aim for high-quality content, i.e., content that provides value. Angles that search engines haven’t indexed before will give you a competitive advantage in rankings. The key is to take a popular topic that has been covered extensively and give it your own unique twist.

#5. Low word count

Low word count can be one sign of “thin content,” which could trigger the Google Panda penalty.
Longer content helps a search engine determine relevancy. If you provide fewer than 250 words, you may have a problem, although the quality of the text is far more important.
Some websites think shorter word counts are better: “People don’t want to read.” But that’s not true. Readers don’t want to wade through useless text to find value. Shorter sentences and shorter paragraphs aid reader scanning, while meatier content provides them more information – what they really want.
Remedy: A home page under 250 words doesn’t tell the reader much. A blog article of 300-400 words may not provide enough depth. 500-1000 words is a great goal, but write for quality and your audience.
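The rough thresholds above can be sketched as a quick check (the 250- and 500-word cutoffs come from the guidance in this section; the function name is invented):

```python
def content_length_flag(text: str) -> str:
    """Flag possibly thin content by word count, per the rough
    thresholds discussed above (250 and 500 words)."""
    words = len(text.split())
    if words < 250:
        return "thin"    # risks a thin-content (Panda) flag
    elif words < 500:
        return "light"   # may not provide enough depth
    return "solid"       # in or above the 500-1000 word sweet spot

# Example: a 10-word snippet counts as thin
print(content_length_flag("word " * 10))  # → thin
```

A check like this is only a tripwire – as the section says, the quality of the text matters far more than the count.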

#6. Scraped content (content lifted from other sources)

There is no value in reposting another’s material, and you shouldn’t be rewarded for it. It’s lazy publishing, it’s plagiarism, and it’s unethical. You can certainly make “fair use” of short bits of other people’s work as a springboard to creating your own larger work, but literally lifting someone else’s work is wrong.
Remedy: Use only original content; use canonical tags in your own content to identify its originality.
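As a sketch, the canonical tag mentioned in the remedy is a single line in the page head that points search engines at the authoritative copy of the content (the URL here is invented):

```html
<head>
  <!-- Marks this URL as the original, authoritative version of the content -->
  <link rel="canonical" href="https://example.com/blog/original-article/">
</head>
```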

#7. Duplicate content

Similarly, running your same material in multiple places on the Internet is wrong. There is a specific penalty for duplicating content. Even replacing a few words here and there doesn’t fool search engines.
Remedy: Don’t duplicate – rewrite! Cover it news-style, with a capsule summary and a link to the original story.

#8. Content you may not generate yourself but that may affect your ranking

  • Auto-generated text (robotic fluff you sometimes see in comments): Seemingly random sets of words that don’t quite make sense, accompanied by strange-looking URLs. It’s garbage meant to fool spam filters.
  • User-generated spam (comment or forum spam): This is often more sensible text and often written to appeal to your vanity, but sometimes contains spam keywords and certainly spam links. Occasionally, the links lead to OK pages but those pages then link to spam pages, which can negatively affect your ranking.

Remedy:

  • Monitor your comments, and if something seems odd about one, don’t post it. More than likely it’s spam. Post guidelines about spam and police them. Spam and auto-generated text often make off-handed comments that have nothing to do with your topic – delete them or send them to the spam folder!
  • New: Google has just launched a new manual spam notification tool in Webmaster Tools to alert you when your site has been manually tagged for spam. Use it to reduce the effect of spam on your site.

Filed Under: Seo Tagged With: black hat SEO, duplicate content, keywords, link relevancy, low quality content, low word count, meta data relevancy, optimization, quality content, scraped content, search, search engines, seo, SEO techniques, white hat SEO, word count

There’s More to SEO in 2013 Than Google

August 4, 2013 by Alan Eggleston

In my last post for Web Content Blog, I discussed changes in SEO for 2013 focused on Google. That’s for two very good reasons. First, Google is dominant in search with over 66% of core searches. Second, Google talks about its changes while other search engines generally don’t. However, Bing broke through the 17% core search threshold for the first time this year (Search Engine Land, May 15, 2013).

The Search Industry

Because Google is dominant in search, it leads the rest of the pack in maintaining the technology. The others follow suit or make changes that help them market against Google. But occasionally, the others change their algorithms to suit their own business aims, and without alerting webmasters, it can send the industry into a tizzy. Bing, for instance, made three algorithm updates in 2012 and their forums lit up trying to figure out what was happening. To date in 2013, Bing has made only one algorithm update. Yahoo uses the Bing algorithm and so a change to the latter is a change to the former, although each tweaks its results to suit its needs.

There has always been a tension between search engines, which are naturally loyal to their search users, and the webmasters on whom search engines rely for the content behind those searches. Search engines set the rules for how webmasters may run the gauntlet of indexing but have offered little help in navigating it.

Google is secretive with details about its updates, but it has communicated upcoming changes and it has answered questions, without giving away details that might negate the changes. Contrast that with Bing, which simply makes updates.

While you can generalize that a broad rule change in Google won’t hurt you with other search engines, it’s to the specifics that you need to pay attention (Bing Webmaster Guidelines, for instance). While some rules apply universally to all search engines, some don’t, so it’s important to know your search engines. If you see in your analytics that a large or growing share of your traffic is coming from Bing, for instance, you need to make sure you don’t set up roadblocks to being found there. A “roadblock” is something that interferes with indexing, including not meeting their guidelines and blocking access to your site.
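One common accidental roadblock is a leftover crawler block in robots.txt. A sketch (the directory name is invented for illustration):

```
# Accidentally blocks every crawler from the entire site --
# a common leftover from a staging or redesign phase
User-agent: *
Disallow: /

# What was probably intended: block only a private section
# User-agent: *
# Disallow: /staging/
```

A directive like the first one keeps every compliant search engine – not just Google – from indexing your site at all.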

Both Google and Bing recommend unique title tags and description tags for each Web page. However, Google only says to keep them brief. Bing gives specific limits in its guidelines. If you’re targeting Bing, the limit is 65 characters for titles and 160 characters for descriptions.
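Those Bing limits are easy to check mechanically – a minimal sketch (65 and 160 are the character limits quoted above; the function name is invented):

```python
TITLE_LIMIT = 65         # Bing's stated title tag limit
DESCRIPTION_LIMIT = 160  # Bing's stated description tag limit

def fits_bing_limits(title: str, description: str) -> bool:
    """Return True if both tags fit within Bing's stated limits."""
    return len(title) <= TITLE_LIMIT and len(description) <= DESCRIPTION_LIMIT

print(fits_bing_limits("Content Marketing Tips | Example Site",
                       "Content marketing tips for writing pages that rank."))  # → True
```

Staying within the stricter limit also satisfies Google’s looser “keep them brief” guidance.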

General Changes to Search

Some general changes in 2013 that apply to all search engines might be:

  • Links for the sake of links can be a bad thing. Focus on links that apply to your content.
  • Using too many keywords looks even more like keyword stuffing.

Technorati suggested no more than three uses of a keyword on a page. However, search engines would counsel, write naturally.

I said “might be” above. That’s because these rules have always been under close scrutiny by search engines. They are just getting more enforcement among more search engines now.

A forum thread earlier in the year noted that Bing and Yahoo were updating their link algorithms to give an edge to branded domain names. That gives the authority of the link to the brand owner, not to you, the website linking to it.

A contributor on Search Engine Land focusing on link maintenance noted that more relevance was being given to links to interior pages than to home pages. In fact, if you check the links to your site and a higher percentage go to your home page than to interior pages, it suggests your content is of lower quality. It pays to maintain your links!

Making the Differences Between Search Engines Work for You

Most businesses – and some SEO practitioners – have decided that since Google is the dominant search engine, they should just optimize for Google. But that ignores huge swaths of the marketplace. Bing is working hard at being competitive, and its share of search is steadily growing at Google’s expense. Its recent advertising blitz comparing its search results against Google’s is making a difference.
Here’s how you can make the differences between search engines work for you.

  • Roost on your site analytics and watch where your traffic originates. There are dozens of search engines, some of them focused on unique markets, that may be sending searchers to your site. Don’t lose that traffic because “Google is dominant in search.” Here are the current 15 most popular search engines. If you have a specialty site, some of these smaller search engines may help you increase traffic.
  • Register with Webmaster Tools on Bing as well as Google and watch for things like Link Alerts (Google) and Crawl Error Alerts (Bing) for opportunities to improve site performance. (Yahoo uses Bing Webmaster Tools and Ask uses Google Webmaster Tools.) Follow their guidelines – note the similarities, and if traffic flows from a particular search engine, note and make use of the discrepancies.
  • With recent revelations of search engine compliance with NSA snooping into citizen Internet use, there has been noticeable growth in smaller, more secure search engines like StartPage and Ixquick. Remember to consider their effect in your traffic increases or decreases. Note: StartPage gets search results from Google; Ixquick sources its results from multiple search engines.

The truth remains: If you write quality content for the reader, you are unlikely to run into problems with any search engine. Optimization remains the art and science of showing up in a search on any search engine.

Filed Under: Seo Tagged With: algorithm updates, algorithms, analytics, Ask, Bing, differences between search engines, Google, increase traffic with seo, Ixquick, search, search engines, search industry, seo, seo 2013, site analytics, StartPage, Webmaster, Webmaster Guidelines, Webmaster Tools, Yahoo
