Content Trends

Content Marketing, SEO and Digital Marketing

How Your Content Is Hurting You, and What You Can Do about It

August 16, 2013 by Alan Eggleston

You are most likely SEO-savvy, using “white hat” SEO techniques, and not losing rank to search engine penalties for “black hat” tricks. But perhaps you know a little about optimization, just not enough to write content that keeps you in the top rankings.

Here are eight top reasons why your content is not making it to the top, along with easy remedies for these common mistakes.

#1. Lack of relevancy for your keyword

One of the most important factors for determining search ranking is relevancy. Is your content relevant to your keywords?
Every page must have a unique set of meta data and links that create that sense of relevancy. Many websites lack both.

  • Meta data relevancy – is your content relevant to the title tag and the description tag you placed on the page? The closer the tags relate to the words used by the searcher and the earlier you use them – in that same order – in the content on the page, the more relevancy your page has.
  • Link relevancy – are your links relevant to your content and keywords? Anchor text was once a key marker, but now a mix of keyword-rich anchor text and more general anchor text linking to still-relevant content is more important.

Remedy: Improve content quality with better research, more data, and more links. No one is coming to your site to see you BS your way to a ranking. They want information. Make it your information, written in your voice, with your style and in your tone. Provide links to back up your material. (A rough meta-data relevancy check is sketched below.)
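
That check is easy to script. Here is a minimal sketch – not the author’s tool – that pulls a page’s title tag and reports any of its words missing from the first 200 words of body copy; the 200-word cutoff and the sample page are illustrative assumptions.

    from html.parser import HTMLParser

    class RelevancyChecker(HTMLParser):
        """Collects the <title> text and the visible body text of a page."""
        def __init__(self):
            super().__init__()
            self.in_title = False
            self.title = ""
            self.body_text = []

        def handle_starttag(self, tag, attrs):
            if tag == "title":
                self.in_title = True

        def handle_endtag(self, tag):
            if tag == "title":
                self.in_title = False

        def handle_data(self, data):
            if self.in_title:
                self.title += data
            else:
                self.body_text.append(data)

    # Illustrative page; in practice you would feed in your own HTML.
    page = "<html><head><title>Pencil Guide</title></head><body>A guide to pencils and how to choose them.</body></html>"
    checker = RelevancyChecker()
    checker.feed(page)
    early_words = " ".join(checker.body_text).lower().split()[:200]
    missing = [w for w in checker.title.lower().split() if w not in early_words]
    print("Title words missing from the first 200 words:", missing)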

#2. Low keyword use

This probably isn’t as big a problem as keyword stuffing. However, if you use the keyword only once or twice, say at the top of the page, and then ignore it thereafter, you’re probably committing this sin. Don’t overuse keywords, but don’t underuse them, either. They help make the page relevant.
Remedy: Make your topic clear. Mention your keyword a few times for clarity.

#3. Irrelevant keyword use (keyword stuffing)

This is usually a “black hat” SEO trick – stuffing the page with keywords. But sometimes it’s inadvertent. Sometimes it’s from an overabundance of caution. One writer on Technorati suggests we shouldn’t use a keyword more than three times, but search engines would suggest you use it only often enough to serve the reader well – not the search engines. And if you have a high word count, it may make more sense to use the keyword more often than if you have a low word count. Use common sense.

Irrelevant keyword use also comes in the form of trying to fit in all the different varieties of a keyword, just in case someone searches for them. Search engines usually recognize the various forms of a word and account for them in a search. Forcing words unnaturally into your content sets off alarms at search engines.
Remedy:

  • Make better matches between keywords, meta data, and meta data placement
  • Use better quality links
  • Use a better variety of anchor text for links
  • Focus your keyword use (a quick density check is sketched below)
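
To keep keyword use focused without stuffing, a simple density count is enough. This is a rough sketch, assuming you audit plain text; the threshold you compare against is your own call, since search engines publish none.

    def keyword_stats(text, keyword):
        """Count keyword occurrences and density as a percentage of all words."""
        words = text.lower().split()
        count = sum(1 for w in words if w.strip(".,!?;:") == keyword.lower())
        density = count / len(words) * 100 if words else 0.0
        return count, density

    article = "Pencils are versatile. A good pencil lasts for years. Choose pencils wisely."
    count, density = keyword_stats(article, "pencils")
    print(f"'pencils' used {count} times ({density:.1f}% of {len(article.split())} words)")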

#4. Low quality content

Search engines insist, “Write for the reader, not for the search engines.” What pleases the reader more than finding a treasure trove of information, written in a format that makes the data easy to pull out?

Poorly written, poorly spelled, poorly constructed content with little value is hard work for the reader and not at all pleasing to search engines. Google Panda was created specifically to weed out low-quality content, and its penalty will send your ranking south.
Remedy: Always aim for high-quality content, i.e., content that provides value. The key is to take a popular topic that has been covered extensively and give it your own unique twist; an angle that search engines haven’t indexed before gives you a competitive advantage in rankings.

#5. Low word count

Low word count can be one sign of “thin content,” which could trigger the Google Panda penalty.
Longer content helps a search engine determine relevancy. If you provide fewer than 250 words, you may have a problem, although the quality of the text is far more important.
Some websites think shorter word counts are better: “People don’t want to read.” But that’s not true. Readers don’t want to wade through useless text to find value. Shorter sentences and shorter paragraphs aid reader scanning, while meatier content provides them more information – what they really want.
Remedy: A home page under 250 words doesn’t tell the reader much. A blog article of 300-400 words may not provide enough depth. 500-1000 words is a great goal, but write for quality and your audience.

#6. Scraped content (content lifted from other sources)

There is no value in reposting another’s material, and you shouldn’t be rewarded for it. It’s lazy publishing, it’s plagiarism, and it’s unethical. You can certainly make “fair use” of short bits of other people’s work as a springboard to creating your own larger work, but literally picking up someone else’s work is wrong.
Remedy: Use only original content; use canonical tags in your own content to identify it as the original. (An example follows.)
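
The canonical tag itself is a single line in the page head – <link rel="canonical" href="https://example.com/original-post/"> – and it’s easy to verify that your pages carry one. Here is a minimal sketch, assuming the rel attribute comes before href, as it usually does; the sample page is made up.

    import re

    def find_canonical(html):
        """Return the canonical URL declared in a page, or None."""
        match = re.search(
            r'<link[^>]*rel=["\']canonical["\'][^>]*href=["\']([^"\']+)["\']',
            html, re.IGNORECASE)
        return match.group(1) if match else None

    page = '<head><link rel="canonical" href="https://example.com/post/"></head>'
    print(find_canonical(page))  # https://example.com/post/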

#7. Duplicate content

Similarly, running the same material in multiple places on the Internet is wrong. There is a specific penalty for duplicating content, and even replacing a few words here and there doesn’t fool search engines.
Remedy: Don’t duplicate – rewrite! Cover it news-style with a short capsule summary and a link to the original story.

#8. Content you may not generate yourself but that may affect your ranking

  • Auto-generated text (robotic fluff you sometimes see in comments): seemingly random sets of words that don’t quite make sense, accompanied by strange-looking URLs. It’s garbage meant to fool spam filters.
  • User-generated spam (comment or forum spam): This is often more sensible text and often written to appeal to your vanity, but sometimes contains spam keywords and certainly spam links. Occasionally, the links lead to OK pages but those pages then link to spam pages, which can negatively affect your ranking.

Remedy:

  • Monitor the comments, and if something seems odd about a comment, don’t post it. More than likely it’s spam. Post guidelines about spam and police them. Spam and auto-generated text often make off-hand comments that have nothing to do with your topic – delete them or send them to the spam folder!
  • New: Google has just launched a manual spam notification tool in Webmaster Tools to alert you when your site has been manually tagged for spam. Use it to reduce the effect of spam on your site.

Filed Under: Seo Tagged With: black hat SEO, duplicate content, keywords, link relevancy, low quality content, low word count, meta data relevancy, optimization, quality content, scraped content, search, search engines, seo, SEO techniques, white hat SEO, word count

There’s More to SEO in 2013 Than Google

August 4, 2013 by Alan Eggleston

In my last post for Web Content Blog, I discussed changes in SEO for 2013 focused on Google. That’s for two very good reasons. First, Google is dominant in search with over 66% of core searches. Second, Google talks about its changes while other search engines generally don’t. However, Bing broke through the 17% core search threshold for the first time this year – Search Engine Land, May 15, 2013.

The Search Industry

Because Google is dominant in search, it leads the rest of the pack in maintaining the technology. The others follow suit or make changes that help them market against Google. But occasionally the others change their algorithms to suit their own business aims, and because they don’t alert webmasters, a change can send the industry into a tizzy. Bing, for instance, made three algorithm updates in 2012, and its forums lit up trying to figure out what was happening. To date in 2013, Bing has made only one algorithm update. Yahoo uses the Bing algorithm, so a change to the latter is a change to the former, although each tweaks its results to suit its needs.

There has always been tension between search engines, which are naturally loyal to their search users, and the webmasters on whom search engines rely for the content those searches surface. Search engines set the rules for how webmasters may run the gauntlet of indexing, but they have offered little help in navigating it.

Google is secretive with details about its updates, but it has communicated upcoming changes and it has answered questions, without giving away details that might negate the changes. Contrast that with Bing, which simply makes updates.

While you can generalize that a broad rule change in Google won’t hurt you with other search engines, it’s to the specifics that you need to pay attention (Bing Webmaster Guidelines, for instance). While some rules apply universally to all search engines, some don’t, so it’s important to know your search engines. If you see in your analytics that a large or growing share of your traffic is coming from Bing, for instance, you need to make sure you don’t set up roadblocks to being found there. A “roadblock” is something that interferes with indexing, including not meeting their guidelines and blocking access to your site.

Both Google and Bing recommend unique title tags and description tags for each Web page. However, Google only says to keep them brief. Bing gives specific limits in its guidelines. If you’re targeting Bing, the limit is 65 characters for titles and 160 characters for descriptions.
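
Those limits are simple to enforce before you publish. A small sketch using the 65- and 160-character figures quoted above; the example tags are made up.

    BING_TITLE_MAX = 65
    BING_DESCRIPTION_MAX = 160

    def check_tags(title, description):
        """Flag title/description tags that exceed Bing's published limits."""
        problems = []
        if len(title) > BING_TITLE_MAX:
            problems.append(f"title is {len(title)} chars (max {BING_TITLE_MAX})")
        if len(description) > BING_DESCRIPTION_MAX:
            problems.append(f"description is {len(description)} chars (max {BING_DESCRIPTION_MAX})")
        return problems or ["OK"]

    print(check_tags(
        "Content Trends | Content Marketing, SEO and Digital Marketing Tips",
        "Tips on content marketing strategy, content writing, SEO and blogging."))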

General Changes to Search

Some general changes in 2013 that apply to all search engines might be:

  • Links for the sake of links can be a bad thing. Focus on links that apply to your content.
  • Using too many keywords looks even more like keyword stuffing.

Technorati suggested no more than three uses of a keyword on a page. However, search engines would counsel: write naturally.

I said “might be” above. That’s because these rules have always been under close scrutiny by search engines. They are just getting more enforcement among more search engines now.

A forum earlier in the year noticed that Bing and Yahoo were updating their link algorithms to give an edge to branded domain names. That gives the authority of the link to the brand owner, not to the website linking to it.

A contributor on Search Engine Land focused on maintaining links noted that more relevance was being given to links to interior pages instead of home pages. In fact, if a higher percentage of the links to your site point to your home page than to interior pages, it suggests your content is of lower quality. It pays to maintain your links!

Making the Differences Between Search Engines Work for You

Most businesses – and some SEO practitioners – have decided that since Google is the dominant search engine, they should just optimize for Google. But that ignores huge swaths of the marketplace. Bing is working hard at being competitive, and its share of search is steadily growing at Google’s expense. Its recent advertising blitz comparing its search results against Google’s is making a difference.
Here’s how you can make the differences between search engines work for you.

  • Roost on your site analytics and watch where your traffic originates. There are dozens of search engines, some focused on unique markets, that may be sending searchers to your site. Don’t lose traffic because “Google is dominant in search.” Here are the current 15 most popular search engines. If you have a specialty site, some of these smaller search engines may help you increase traffic.
  • Register with Webmaster Tools on Bing as well as Google and watch for things like Link Alerts (Google) and Crawl Error Alerts (Bing) for opportunities to improve site performance. (Yahoo uses Bing Webmaster Tools and Ask uses Google Webmaster Tools.) Follow their guidelines – note the similarities, and if traffic flows from a particular search engine, note and make use of the discrepancies.
  • With recent revelations of search engine compliance with NSA snooping into citizen Internet use, there has been noticeable growth in smaller, more secure search engines like StartPage and Ixquick. Remember to consider their effect in your traffic increases or decreases. Note: StartPage gets search results from Google; Ixquick sources its results from multiple search engines.

The truth remains: If you write quality content for the reader, you are unlikely to run into problems with any search engine. Optimization remains the art and science of showing up in a search on any search engine.

Filed Under: Seo Tagged With: algorithm updates, algorithms, analytics, Ask, Bing, differences between search engines, Google, increase traffic with seo, Ixquick, search, search engines, search industry, seo, seo 2013, site analytics, StartPage, Webmaster, Webmaster Guidelines, Webmaster Tools, Yahoo

Nothing Much in SEO Has Changed in 2013 – and Here’s Why

July 26, 2013 by Alan Eggleston

Some SEO practitioners look on search engine updates like Google Panda and Google Penguin, introduced in 2011 and 2012 respectively, as nuclear-option penalty devices. An SEO guy is just doing his job, after all, trying to get his clients noticed in a competitive field. A nip here, a tuck there, ignore some rules over here, wink at a search engine tut-tut back there, and who gets hurt? Search engines don’t see it that way: all the nips and tucks break their rules.

Search engines have hardly been proactive, at least until recently. All their algorithm changes have been reactive – reacting to SEO practitioners bending the rules or ignoring them altogether to rank ahead of the field in searches. Google Panda (meant to fight thin or low-quality content) and Google Penguin (meant to fight spam) were major adjustments to rein in SEO game playing. And they seem to have worked, because Google has made few other major changes in 2013. Here is a list of Google’s other reactions to date in 2013.

Google’s 2013 Effort to Tame SEO

#1. Continued minor updates of Google Penguin (to reduce spam) and Google Panda (to reduce thin content).

The most recent Panda update was much milder than past ones, and some sites have reported recovering rankings.

#2. Loss of relevance of keyword links (anchor text).

The practice once was to make sure you used keywords in anchor text, but Google says that’s no longer as important. They are attaching more relevance to natural language – write the content naturally and then add the links. It’s OK to use referral language or even something as simple as “click here.” Google is also connecting sites through brand names that aren’t even linked, so keyword linking between brands isn’t always necessary.

#3. Fewer SERP results per domain name per keyword phrase.

Fewer SERP results per domain name means a business used to seeing its website rank multiple times under the same keyword or keyword phrase will now rank at most four times. For example, say your domain name is pencils.com, your keyword phrase is “2 lead” (as in #2 lead pencil), and before the change you appeared in the results six times for different colored leads or different pack sizes. Now you will appear at most four times for those same colors or packs. The domain name remains just as relevant, but the number of rankings is lower. Google is trying to reduce the dominance of some players in the rankings while making room for others who were being crowded out by those dominant players.
Thinning out some of the same-domain-name results may help bring up some of the deeper rankings. It’s both a benefit and a bane, depending on how your website ranks.

#4. Refined Google authority algorithm to boost site authority on topics.

Topic or industry authority figures high in ranking and refining this algorithm should aid those with higher quality content and further penalize those with lower quality content. Again, the goal is to benefit the reader, and it also benefits the higher quality content producers.

#5. Less rhetoric on link building, but continued emphasis on link quality.

To reduce the effect of specious or unnatural links, Google is downplaying the number of links in favor of ensuring the links you do include have meaning. Links are still important, but you are rewarded for high quality links and not rewarded simply for having links.

Google 2013 Proactive Steps to Help Webmasters Maintain Links

Surprising SEO practitioners, Google did take two proactive steps that help webmasters manage optimization. They can help you maintain quality content, too.

#1. New “Disavow Links” feature to disavow inbound links.

Acknowledging that competitors and spammers may create links to your site that detract from your ranking, Google now allows you to “disavow” links. Make use of this feature among others in Google’s Webmaster Tools.
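
The disavow file itself is plain text: comment lines start with #, whole domains use a domain: prefix, and individual URLs stand alone. Here is a quick sketch that writes one; the entries are invented examples, not real sites.

    # Build a disavow file for upload through Google Webmaster Tools.
    bad_links = [
        "# Spammy directory links; owner did not respond to removal requests",
        "domain:spammy-directory.example",
        "http://link-farm.example/page-42.html",
    ]

    with open("disavow.txt", "w", encoding="utf-8") as f:
        f.write("\n".join(bad_links) + "\n")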

#2. New “Link Alerts” feature to help with maintenance of bad links.

Another new feature Google has added to work with websites is “Link Alerts.” When Google discovers problem links, they now alert you through Webmaster Tools and provide example URLs. Another feature on Webmaster Tools is the ability to track all the links to and on your site, which makes following up on Link Alerts easier.

Pleasing the Reader Pleases the Search Engines – Still in 2013

What I have found is that it’s best to ignore all the hype surrounding SEO (Search Engine Optimization – optimizing for search engines) and personally work on Website Optimization (optimizing a website for search – removing roadblocks and optimizing reader opportunities within my pages). I try to write quality content that makes the content clear to the reader from the beginning. When it’s clear to the reader, it will also be clear to the search engines. In that respect, nothing much has really changed in 2013.

Filed Under: Seo Tagged With: 2013 seo, panda, penguin, seo, seo 2013, seo updates

5 Ways to Measure the Popularity of Your Website

September 18, 2012 by Gazalla Gaya

Ever wonder how you stack up against the billions of websites in cyberspace? Measuring your site’s popularity is not just an exercise in vanity; it’s an excellent way to understand what you need to do to improve your site. Everyone on your web team benefits from knowing your site’s popularity metrics, from webmasters to designers to marketers to the site’s owners and stakeholders.

  • Business owners, of course, have the most to lose in sales, profits, and future prospects if their site is not popular. If you sell anything online, the popularity of your site will determine your earnings. Researching your site’s popularity will give you a better feel for where to focus your budget to improve future metrics.
  • If you are a webmaster, your site’s popularity metrics will tell you what you need to do to improve speed and metadata and where to make changes in code and site optimization.
  • The site’s content team needs to know how popular the site is so they can adjust their content, make it more appealing to certain types of visitors, and apply strategies to further engage and grow their audience.

#1. Alexa.com

Your Alexa ranking is a pretty good indicator of the popularity of your site. I always like to look at the global Alexa rank of a site, especially before I do business with them. Advertisers tend to look at these stats. If you are into any type of affiliate marketing, you already know the importance of Alexa ranking, as many affiliate marketers will do business with your blog based on your Alexa rank. Alexa will also give you other important statistics such as your rank in a particular country, traffic stats, and backlink info.

Formerly, it was essential to have the Alexa toolbar installed so that Alexa could track the number of visitors. But in 2008, Alexa went through a major update, and it claims it no longer needs the toolbar installed to calculate your global rank.

Alexa provides traffic data, global rankings and other information on thousands of websites, and claims that 6 million people visit its website monthly.

Alexa is good as a research tool as it also compares competitor sites and shows you:

  • How popular the site is compared to yours, including Reach, Pageviews and more.
  • Search Analytics that indicate which terms your competition is using to get traffic.
  • Audience data such as what kind of visitors your competition is attracting.
  • Clickstream data that indicate where your competition is getting traffic from affiliate programs and partners.

Alexa ranking for WordPress.com, the popular blogging platform

#2. SEOmoz mozRank

mozRank is SEOmoz’s measure of link authority and popularity, scored on a 10-point scale. It’s similar to the old Google PageRank and is logarithmic. That means it’s ten times as hard to move from a 3 to a 4 as it is to move from a 2 to a 3. Google PageRank is no longer an accurate measure: it’s updated very infrequently, and new sites are not given the SOAP API key that was used to effectively determine PageRank.
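
The arithmetic behind that logarithmic claim is easy to see. A toy illustration – the mapping below is hypothetical, purely to show the shape of a 10-point log scale, not SEOmoz’s actual formula:

    import math

    def rank(raw_authority):
        """Hypothetical log10 mapping from raw link authority to a 10-point rank."""
        return math.log10(raw_authority)

    for raw in (100, 1_000, 10_000):
        print(f"raw authority {raw:>6} -> rank {rank(raw):.0f}")
    # Each extra rank point costs ten times the raw authority of the last.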

Each link back counts as a vote for your site and search engines often rank pages with higher global link authority ahead of pages with lower authority. Measures like mozRank are global and static, hence, this ranking power applies to a broad range of search queries, rather than pages optimized specifically for a particular keyword.

It’s a good idea to have the MozBar installed, as you will see your rank and also immediately know the rank of any site you visit. Open Site Explorer, another popular SEOmoz tool, also gives you your mozRank (mR) along with other factors such as page authority (PA) and domain authority (DA).

#3. Marketing Grader tool by HubSpot

Are you doing enough to bring visitors to your website and fill the top of your sales and marketing funnel? How do you do when it comes to converting traffic into leads and leads into customers? Do you know what marketing activities are working?

I like HubSpot’s Marketing Grader tool because it points out exactly where I can improve, based on a review of over 30 factors, and then gives me an overall marketing grade on a 0-100 scale.

The Marketing Grader tool studies over 30 different parameters including popularity metrics such as:

  • Are your marketing efforts generating sales and leads?
  • Is your blog driving results that justify the time investment or are you wasting time doing the wrong things?
  • Your mozRank and your Klout score
  • Your analytics and monthly web traffic
  • How effectively are you using social media to drive traffic to your website?
  • What are the strong points and suggestions for improvement in your marketing strategy?

Marketing Grader report for Web Content Blog showing me different areas for improvement

#4. Analytics

Analytics software such as Google Analytics and Omniture analyzes different metrics such as:

  • Number of page views, visitors, pages per visit, new visitors vs. returning visitors
  • Most popular posts based on number of page views
  • Most shares on social media
  • Other metrics such as demographics – visitors by geographic location, language spoken, etc.
  • Metrics that measure engagement, such as time on site and bounce rate (a simple calculation is sketched below).
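
Two of those engagement metrics are straightforward to compute from raw visit data. A toy sketch – not any analytics product’s API – where each session is the list of pages viewed during one visit, and the sessions are invented:

    sessions = [
        ["/home"],                        # single-page visit: a bounce
        ["/home", "/blog", "/contact"],
        ["/blog/seo-tips"],               # another bounce
        ["/home", "/services"],
    ]

    pages_per_visit = sum(len(s) for s in sessions) / len(sessions)
    bounce_rate = sum(1 for s in sessions if len(s) == 1) / len(sessions) * 100
    print(f"Pages/visit: {pages_per_visit:.2f}, bounce rate: {bounce_rate:.0f}%")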

#5. Social Media

In the current climate, your website’s popularity is also gauged by the number of times your site’s content is shared on social media. Your site is naturally more popular the more it is shared. Each share, like, tweet, or Reddit upvote is counted by search engines as a vote for your site’s content, and social shares are an important way for search engines to determine the popularity of your website.

Sites such as Topsy do a good job of showing which post or page of your website got the most shares. I also like URL shorteners such as bit.ly that tell you how many times a piece of content was shared and by whom. I also like TweetMeme, which shows you the hottest links on Twitter. The Social Overview and Social Sources sections of your Google Analytics are also pretty strong indicators of your site and content’s popularity. Of course, you can also get various paid solutions such as Radian6 to monitor social media analytics.

If your website has a blog, there are other indicators, such as your site’s RSS feed subscriptions and number of email subscribers, that give you an idea of your blog’s popularity. Even though fewer people subscribe to RSS feeds, and those who do tend to be technically savvy readers, you can get some idea by checking your FeedBurner stats or your RSS feed stats through Google Reader. The number of email subscribers, and also their type (profession, etc.), could give you an indication of which niche market your blog is best aimed at.

How about you? Which tools do you use to measure the popularity of your website, and why do you measure popularity? To gain more business? To write better content? Please share in the comments below. Thanks.

You May Also Like:

7 Essential SEO Tools for All Content Writers
5 Content guidelines that Attract Customers and Search Engines
10 Free SEO Tools

Filed Under: Seo Tagged With: alexa, global rank, hubspots market grader tool, measure site's popularity, mozrank, ranking of site, seo

5 Content Guidelines to Attract Prospects and Search Engines

August 18, 2012 by Gazalla Gaya

Google’s Panda and Penguin updates shook up the SEO community, and more updates are on the way. In fact, according to Search Engine Roundtable, Matt Cutts said that the next few Penguin updates will be jarring and jolting for SEOs. Cutts was speaking in a keynote address at SES San Francisco on August 15, 2012.

Google Doesn’t Hate SEO

As he has in the past, Cutts pointed out that Google doesn’t hate SEO. The goal of SEO is to make websites more crawlable and faster. SEO becomes an issue when spam comes into play – if you go overboard buying links, doing comment spam links, or keyword stuffing.

The fact is that Google makes all these updates to ensure that search results are relevant to the user. Google is primarily interested in relevancy, context, and quality of content. If you write your content primarily to attract your prospects while keeping search engines in mind so that they can easily index that content, you’ve created a win-win situation. As long as users find your content useful and easy to navigate, scan, and understand, you’ve made the job easy for both prospects and search engines.

Here are a few content guidelines that Google will not ignore, since they entice the user and are relevant and contextual. They allow the user to quickly scan your copy, and at the same time they allow search engines to efficiently index your content.

#1. Unique and Engaging Content

Prospects are automatically attracted to unique and original content. Posting unique, high-quality content is also an SEO strategy. It is the most important and effective way to receive good rankings, for several reasons:

  • When your content is unique, other sites will start linking to you. That’s automatic link building, which is an essential SEO strategy.
  • The very reason Panda and Penguin were born was to eliminate spammy sites, duplicate content, and obvious keyword stuffing, and to reward original, high-quality content. Posting unique, high-quality content has been endorsed by Google and other search engines in all their SEO guides.
  • If your content is unique, people are bound to share it. Most search engines count social shares as an indicator of the quality of your site. Your social vote increases your visibility, and the end result is higher rankings.

#2. Use of Proper Heading Tags

Headlines that engage the reader are instantly successful. They attract and entice your prospect to read the rest of your copy. Successful headlines allow you to tell your story in a few powerful words.

At the same time, heading tags help search engines determine the structure of your document. SEO best practices for headings include:

  • Your H1 is your top-level heading and should be used only once per document. It needs to inform search engines about the focus of your document.
  • Use headings and subheadings only to define the structure of your document; avoid using them purely for styling.
  • Use keywords in your headings, preferably toward the beginning of the heading.
  • H2-H6 are the lower-level headings that help you organize your content. Use H2 and H3 headings according to the importance of the sub-topic. (A quick structural check is sketched after this list.)
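
Those structural rules can be checked mechanically. A rough sketch using Python’s standard HTML parser to confirm there is exactly one H1 and no skipped heading levels; the sample markup is made up.

    from html.parser import HTMLParser

    class HeadingAudit(HTMLParser):
        """Records the level of every h1-h6 tag in document order."""
        def __init__(self):
            super().__init__()
            self.levels = []

        def handle_starttag(self, tag, attrs):
            if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
                self.levels.append(int(tag[1]))

    audit = HeadingAudit()
    audit.feed("<h1>Guide</h1><h2>Basics</h2><h4>Oops</h4>")
    if audit.levels.count(1) != 1:
        print("Expected exactly one H1, found", audit.levels.count(1))
    for prev, cur in zip(audit.levels, audit.levels[1:]):
        if cur > prev + 1:
            print(f"Skipped level: h{prev} followed by h{cur}")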

#3. Anchor text and links

Anchor text helps the reader quickly find pertinent, relevant info either on your site or on another useful site.

SEO best practices include:

  • In the early days of the web, you could get away with anchor text such as “Click here for more info.” These days you need to clearly define the purpose of your anchor text. Anchor text should be keyword-rich, but be subtle and sophisticated in your use of keywords: use a keyword too many times and Google will think you are a spammer, yet your anchor text still needs to inform search engines about the focus of the document it links to. That’s the fine balance to strive for. Avoid keyword stuffing and repeating the same keyword across many links.
  • Make it easy for users to distinguish between regular text and your anchor text.

#4. Internal links

Your prospects like internal links because they help them better navigate your site. Treat each page on your site and each post on your blog as a small part of a larger story, and you will be able to help readers navigate through the whole story with an efficient use of internal links.

Not surprisingly, search engines also like internal links, as they help them understand the internal structure of your pages. SEO best practice is to link only to relevant content within the site structure.

#5. Images

Images instantly attract readers and lure them into reading your page.
Here are some SEO best practices for images:

  • Alt tags allow a search engine to understand what your image is about. (A quick audit of alt text and file names is sketched after this list.)
  • Avoid naming files with generic names such as image1.jpg. Follow a naming convention that is relevant to each image, e.g., smallcat.jpg.
  • Store all images in a single directory. Search engines like this practice, as it helps them understand the internal file structure.
  • Use an image compression program such as Smush.it to compress your images. The less space they occupy, the quicker your site will load.
  • Provide search engines with an image sitemap file.
  • Always give attribution. According to Google’s recent update just last week, sites that don’t give attribution for content will be penalized. This usually happens with images.
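
The first two practices are easy to audit in bulk. A small sketch that flags images missing alt text or carrying generic numbered file names; the sample markup and the file-name pattern are my own assumptions.

    from html.parser import HTMLParser
    import re

    class ImageAudit(HTMLParser):
        """Flags <img> tags with no alt text or generic numbered file names."""
        def handle_starttag(self, tag, attrs):
            if tag != "img":
                return
            attrs = dict(attrs)
            src = attrs.get("src", "")
            if not attrs.get("alt"):
                print(f"Missing alt text: {src}")
            if re.search(r"(?:^|/)(?:img|image|photo)?\d+\.\w+$", src, re.IGNORECASE):
                print(f"Generic file name: {src}")

    ImageAudit().feed('<img src="images/image1.jpg">'
                      '<img src="small-cat.jpg" alt="A small cat">')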

If you create unique, interesting, powerful content and follow these content guidelines and SEO best practices, you will attract prospects and search engines, build your brand, and increase your exposure through improved rankings.

Do you have any more content guidelines to add? Please share in the comments below. Thanks.

You May Also Like
7 Essential SEO Tools for All Content Writers
How to Use the Magic of Meta Tags to Improve Rankings
Google gets Picky Over Content with Panda

Filed Under: Seo Tagged With: anchor text, content, content optimization, heading tags, images, internal links, optimizing content for google, search engine optimization, seo

