
Content Trends

Content Marketing, SEO and Digital Marketing


There’s More to SEO in 2013 Than Google

August 4, 2013 by Alan Eggleston

In my last post for Web Content Blog, I discussed changes in SEO for 2013 focused on Google. That focus was for two very good reasons. First, Google is dominant in search, with over 66% of core searches. Second, Google talks about its changes, while other search engines generally don’t. However, Bing broke through the 17% core search threshold for the first time this year (Search Engine Land, May 15, 2013).

The Search Industry

Because Google is dominant in search, it leads the rest of the pack in advancing the technology. The others follow suit or make changes that help them market against Google. But occasionally the others change their algorithms to suit their own business aims, without alerting webmasters, and that can send the industry into a tizzy. Bing, for instance, made three algorithm updates in 2012, and its forums lit up trying to figure out what was happening. To date in 2013, Bing has made only one algorithm update. Yahoo uses the Bing algorithm, so a change to Bing is effectively a change to Yahoo, although each tweaks its results to suit its needs.

There has always been tension between search engines, which are naturally loyal to their search users, and the webmasters on whom search engines rely for the content those users search for. Search engines set the rules for how webmasters may run the gauntlet of search engine indexing, but they have offered little help in navigating it.

Google is secretive with details about its updates, but it has communicated upcoming changes and it has answered questions, without giving away details that might negate the changes. Contrast that with Bing, which simply makes updates.

While you can generalize that a broad rule change at Google won’t hurt you with other search engines, it’s the specifics you need to pay attention to (the Bing Webmaster Guidelines, for instance). Some rules apply universally to all search engines and some don’t, so it’s important to know your search engines. If you see in your analytics that a large or growing share of your traffic comes from Bing, for instance, make sure you don’t set up roadblocks to being found there. A “roadblock” is anything that interferes with indexing, including not meeting the engine’s guidelines or blocking access to your site.

Both Google and Bing recommend unique title tags and description tags for each web page. However, Google only says to keep them brief, while Bing gives specific limits in its guidelines: if you’re targeting Bing, the limit is 65 characters for titles and 160 characters for descriptions.
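If you want to check your own pages against those limits, a short script can flag over-long tags. This is a minimal sketch (the 65- and 160-character limits are the Bing figures cited above; the function and constant names are my own):

```python
import re

# Limits from the Bing guidelines discussed above; adjust if they change.
TITLE_LIMIT = 65
DESCRIPTION_LIMIT = 160

def check_meta_lengths(html):
    """Return a list of warnings for over-limit title/description tags."""
    warnings = []
    title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
    if title and len(title.group(1).strip()) > TITLE_LIMIT:
        warnings.append("title exceeds %d characters" % TITLE_LIMIT)
    desc = re.search(
        r'<meta[^>]+name=["\']description["\'][^>]+content=["\'](.*?)["\']',
        html, re.I | re.S)
    if desc and len(desc.group(1).strip()) > DESCRIPTION_LIMIT:
        warnings.append("description exceeds %d characters" % DESCRIPTION_LIMIT)
    return warnings
```

Run it over each page’s HTML and fix whatever it flags before worrying about anything fancier.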

General Changes to Search

Some general changes in 2013 that apply to all search engines might be:

  • Links for the sake of links can be a bad thing. Focus on links that apply to your content.
  • Using too many keywords looks even more like keyword stuffing.

Technorati suggested no more than three uses of a keyword on a page. The search engines themselves would simply counsel: write naturally.

I said “might be” above. That’s because these rules have always been under close scrutiny by search engines. They are just being enforced more, and by more search engines, now.

A forum earlier in the year noticed that Bing and Yahoo were updating their algorithms for links to give an edge to branded domain names. That gives the authority of the link to the brand owner, not to you, the website linking to it.

A contributor on Search Engine Land focusing on link maintenance noted that more relevance was being given to links to interior pages than to home pages. In fact, if you check the links to your site and a higher percentage point to your home page than to interior pages, it may signal that your content is of lower quality. It pays to maintain your links!
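If you export the inbound links to your site (from Webmaster Tools, for instance), a quick calculation shows how heavily they favor your home page. A sketch using a hypothetical link list:

```python
from urllib.parse import urlparse

def homepage_link_share(target_urls):
    """Fraction of inbound links pointing at the home page (path '/' or
    empty) rather than an interior page."""
    if not target_urls:
        return 0.0
    home = sum(1 for u in target_urls if urlparse(u).path in ("", "/"))
    return home / len(target_urls)

# Hypothetical exported link targets:
links = ["http://example.com/", "http://example.com/blog/post-1",
         "http://example.com/about", "http://example.com"]
print(round(homepage_link_share(links), 2))  # 0.5
```

A share well above one half would suggest it’s time to build and maintain links to your interior pages.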

Making the Differences Between Search Engines Work for You

Most businesses – and some SEO practitioners – have decided that since Google is the dominant search engine, they should just optimize for Google. But that ignores huge swaths of the marketplace. Bing is working hard at being competitive, and its share of search is steadily growing at Google’s expense. Its recent advertising blitz comparing its search results against Google’s is making a difference.
Here’s how you can make the differences between search engines work for you.

  • Keep a close watch on your site analytics and where your traffic originates. There are dozens of search engines, some of them focused on unique markets, whose users may be searching for your site. Don’t lose that traffic just because “Google is dominant in search.” Here are the current 15 most popular search engines. If you have a specialty site, some of these smaller search engines may help you increase traffic.
  • Register with Webmaster Tools on Bing as well as Google, and watch for things like Link Alerts (Google) and Crawl Error Alerts (Bing) for opportunities to improve site performance. (Yahoo uses Bing Webmaster Tools, and Ask uses Google Webmaster Tools.) Follow their guidelines – note the similarities, and if traffic flows from a particular search engine, note and make use of the differences.
  • With recent revelations of search engine compliance with NSA snooping into citizen Internet use, there has been noticeable growth in smaller, more secure search engines like StartPage and Ixquick. Remember to consider their effect in your traffic increases or decreases. Note: StartPage gets search results from Google; Ixquick sources its results from multiple search engines.

The truth remains: If you write quality content for the reader, you are unlikely to run into problems with any search engine. Optimization remains the art and science of showing up in a search on any search engine.

Filed Under: Seo Tagged With: algorithm updates, algorithms, analytics, Ask, Bing, differences between search engines, Google, increase traffic with seo, Ixquick, search, search engines, search industry, seo, seo 2013, site analytics, StartPage, Webmaster, Webmaster Guidelines, Webmaster Tools, Yahoo

Nothing Much in SEO Has Changed in 2013 – and Here’s Why

July 26, 2013 by Alan Eggleston

Some SEO practitioners look on search engine updates like Google Panda and Google Penguin, introduced in 2011 and 2012 respectively, as nuclear-option penalty devices. An SEO guy is just doing his job, after all, trying to get his clients noticed in a competitive field. A nip here, a tuck there, ignore some rules over here, wink at a search engine tut-tut back there – and who gets hurt? Search engines don’t see it that way: all the nips and tucks break their rules.

Search engines are hardly proactive, or at least weren’t until recently. Their algorithm changes have been reactive – reactions to SEO practitioners bending the rules or ignoring them altogether to rank ahead of the field in searches. Google Panda (meant to fight thin or low-quality content) and Google Penguin (meant to fight spam) were major adjustments to rein in SEO game playing. And they seem to have worked, because Google has made few other major changes in 2013. Here is a list of Google’s other reactions to date in 2013.

Google’s 2013 Effort to Tame SEO

#1. Continued minor updates of Google Penguin (to reduce spam) and Google Panda (to reduce thin content).

The most recent Panda update was much milder than past ones, and some sites have recently reported recovering rankings.

#2. Loss of relevance of keyword links (anchor text).

The practice once was to make sure you used keywords in anchor text, but Google says that’s no longer as important. It is attaching more relevance to natural language – write the content naturally, then add the links. It’s OK to use referral language, or even something as simple as “click here.” Google is also connecting sites through brand names that aren’t even linked, so keyword linking between brands isn’t always necessary.

#3. Fewer SERP results per domain name per keyword phrase.

Fewer SERP results per domain name means a business used to its website ranking multiple times under the same keyword or keyword phrase will now rank at most four times. For example, say your domain name is pencils.com and your keyword phrase is 2 lead (as in #2 lead pencil); before the change, you appeared in the results six times for different colored leads or different pack sizes. Now you will appear at most four times for those same colors or packs. The domain name remains just as relevant, but the number of rankings is lower. Google is trying to reduce the dominance of some players in the rankings while making room for others who were being crowded out by those dominant players.
Thinning out some of the same-domain-name results may help bring up some of the deeper rankings. It’s both a benefit and a bane, depending on how your website ranks.

#4. Refined Google authority algorithm to boost site authority on topics.

Topic or industry authority figures highly in ranking, and refining this algorithm should aid those with higher-quality content and further penalize those with lower-quality content. Again, the goal is to benefit the reader, and it also benefits higher-quality content producers.

#5. Less rhetoric on link building, but continued emphasis on link quality.

To reduce the effect of specious or unnatural links, Google is downplaying the number of links in favor of ensuring the links you do include have meaning. Links are still important, but you are rewarded for high-quality links, not simply for having links.

Google 2013 Proactive Steps to Help Webmasters Maintain Links

Surprising SEO practitioners, Google did take two proactive steps that help webmasters manage optimization. They can help you maintain quality content, too.

#1. New “Disavow Links” feature to disavow inbound links.

Acknowledging that competitors and spammers may create links to your site that detract from your ranking, Google now allows you to “disavow” links. Make use of this feature among others in Google’s Webmaster Tools.

#2. New “Link Alerts” feature to help with maintenance of bad links.

Another new feature Google has added is “Link Alerts.” When Google discovers problem links, it now alerts you through Webmaster Tools and provides example URLs. Webmaster Tools also lets you track all the links to and on your site, which makes following up on Link Alerts easier.

Pleasing the Reader Pleases the Search Engines – Still in 2013

What I have found is that it’s best to ignore all the hype surrounding SEO (Search Engine Optimization – optimizing for search engines) and personally work on Website Optimization (optimizing a website for search – removing roadblocks and optimizing reader opportunities within my pages). I try to write quality content that, from the beginning, makes the content clear to the reader. When it’s clear to the reader, it will also be clear to the search engines. In that respect, nothing much has really changed in 2013.

Filed Under: Seo Tagged With: 2013 seo, panda, penguin, seo, seo 2013, seo updates

Effective Ways to Improve Your Alexa Rankings and Increase Your Traffic

December 15, 2012 by Erik Emanuelli

There are many reasons why you might be interested in improving your Alexa Rank and increasing your website traffic.

But what exactly is Alexa?

As you probably know, Alexa is a company that tracks the most visited sites in the world.

Each website is assigned a score relative to its position in the ranking of the most visited sites. Google is number 1, which means it is the most visited place online. And so on.

Each score is unique: no two sites have the same score.

To judge sites around the web, the Alexa ranking is based on a toolbar installed by millions of people around the world, which anonymously reports which sites each user visits.

The system is quite vulnerable because many people do not have the toolbar installed, and toolbar adoption is much more common in some countries than in others.

Another vulnerability comes from the fact that sites dedicated to webmasters – for instance WordPress guides, tech news, web design blogs – usually have very high rankings, because visitors to these kinds of websites often have the toolbar installed. Of course this inflates their rankings.

To correct these vulnerabilities, Alexa has for at least two years no longer depended solely on its toolbar to gauge traffic stats; it uses several different sources to gather that information.

I am really curious which traffic sources Alexa considers, but unfortunately, as with the Google algorithm, the secret is well kept.


Common Tips to Increase Your Alexa Rank

1. Install the Toolbar

This is perhaps the most obvious step: install the Alexa Toolbar. Any time you visit your website, you contribute to your rankings. Each visit is counted only once every 24 hours, so don’t keep refreshing the page to increase your rankings – it’s useless.
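The once-per-24-hours rule is easy to picture in code. This is a hypothetical sketch of how such deduplication might work, not Alexa’s actual implementation:

```python
from datetime import datetime, timedelta

class DailyVisitCounter:
    """Count at most one visit per visitor per 24-hour window
    (a sketch of the dedup rule described above, not Alexa's code)."""
    def __init__(self):
        self.last_seen = {}  # visitor_id -> time of last counted visit
        self.count = 0

    def record(self, visitor_id, when):
        last = self.last_seen.get(visitor_id)
        if last is None or when - last >= timedelta(hours=24):
            self.last_seen[visitor_id] = when
            self.count += 1
            return True   # visit counted
        return False      # ignored: within 24 hours of the last one

c = DailyVisitCounter()
t0 = datetime(2012, 12, 15, 9, 0)
c.record("me", t0)                        # counted
c.record("me", t0 + timedelta(hours=1))   # a refresh -- ignored
c.record("me", t0 + timedelta(hours=25))  # next day -- counted again
print(c.count)  # 2
```

As the sketch shows, refreshing within the window changes nothing; only the first visit of each 24-hour period counts.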

2. Suggest Your Readers Install the Alexa Toolbar

You can add a link on your website where your readers can easily download the Alexa Toolbar, and then invite your users to get it. If you have a newsletter, that’s one way to reach your audience. Even if only 20-30 loyal users install the toolbar in their browsers, it will help a lot in improving your Alexa rankings.

3. Add the Alexa Widget to Your Website

Contrary to what you may read in some articles, adding the Alexa Widget to your blog does not automatically increase your rank, but it displays some of your site’s numbers and invites your users to click on it. Every redirected visit improves your ranking.

4. Write Some Reviews on Alexa

I am sure you have some favorite blogs you read with pleasure. So why not write reviews of these sites on Alexa? Let the blog owners know you wrote a review, and they will probably return the favor. Writing a review on one of the top 100 Alexa-rated sites and including your URL could also be a way to improve your rank.

5. Get Some Social Media Traffic

Submit your best posts to popular social media sites like Facebook, StumbleUpon or Reddit: this will bring a lot of traffic to your blog and will help improve your Alexa rank.

6. Write Webmaster-Related Content

Many webmasters have the Alexa toolbar installed. You want to write interesting posts related to SEO, online marketing or webmaster tools and guides. You can then share these articles on webmaster forums and social networking sites.

More Content? More Conversions and Visits by Users!

You already know: the most important part of any online project is the content. More precisely, the most important rule is the creation of high-quality content.

It is also known that, in addition to quality, the quantity of content matters. This is important primarily from an SEO point of view. If your website has a lot of content, you have more chances of being well positioned in search engines, for a variety of reasons – and keywords. Your visibility will be much greater than that of competitors who, all else being equal, create less content than you. An analysis published on the HubSpot Blog shows some data on how the amount of content has a positive impact on a website. As mentioned above, if you have a site with lots of content, you have a better chance of getting more visibility and conversions.

Specifically, here’s what the study showed:

  • Content shared on social media sites is more successful and popular;
  • Websites with a greater number of indexed pages – that is, with more content created – are more likely to get inbound links;
  • Websites with more content have more conversions – yet another proof that creating content in quantity, especially quality content, gives your website a nice competitive advantage over your competitors.

Below is a graph from HubSpot that summarizes the findings of the research. It is a crop of the full infographic.

Google Indexed pages vs. Alexa rank

Do you agree with this graph? I ran a test on my own website, and that case study shows that publishing content daily will quickly earn you a great Alexa Rank (from 100k to 29k in 2 months).

What is your experience?

Do you care about Alexa?

Please share your opinion in the comments. Thanks!

Image courtesy of Stuart Miles at FreeDigitalPhotos.net

Filed Under: Seo

5 Ways to Measure the Popularity of Your Website

September 18, 2012 by Gazalla Gaya

Ever wonder how you stack up against the billions of websites in cyberspace? Measuring your site’s popularity is not just a pursuit of vanity; it’s an excellent way to understand what measures you need to take to improve your site. Everyone on your web team benefits from knowing your site’s popularity metrics, from webmasters to designers to marketers to the site’s owners and stakeholders.

  • Business owners, of course, have the most to lose – in sales, profits and future prospects – if their site is not popular. If you sell anything online, the popularity of your site will determine your earnings. Researching your site’s popularity will give you a better feel for where to focus your budget to obtain better future metrics.
  • If you are a webmaster, your site’s popularity metrics will tell you what you need to do to improve speed and metadata, and where to make changes to code and site optimization.
  • The site’s content team needs to know how popular the site is so that they can adjust their content, make it more appealing to certain types of visitors, and apply strategies to increase and further engage their audience.

#1. Alexa.com

Your Alexa ranking is a pretty good indicator of the popularity of your site. I always like to look at the global Alexa rank of a site, especially before I do business with them. Advertisers tend to look at these stats. If you are into any type of affiliate marketing you already know the importance of Alexa ranking as many affiliate marketers will do business with your blog based on your Alexa rank. Alexa will also give you other important statistics such as your rank in a particular country, traffic stats and back links info.

Formerly, it was essential to have the Alexa toolbar installed so that Alexa could track the number of visitors, but in 2008 Alexa went through a major update, and it claims it no longer needs the toolbar installed to calculate your global rank.

Alexa provides traffic data, global rankings and other information on thousands of websites, and claims that 6 million people visit its website monthly.

Alexa is good as a research tool as it also compares competitor sites and shows you:

  • How popular the site is compared to yours, including Reach, Pageviews and more.
  • Search Analytics that indicate which terms your competition is using to get traffic.
  • Audience data such as what kind of visitors your competition is attracting.
  • Clickstream data that indicate where your competition is getting traffic from affiliate programs and partners.

Alexa ranking
Alexa ranking for WordPress.com, the popular blogging platform

#2. SEOmoz mozRank

mozRank is SEOmoz’s 10-point measure of link authority and popularity, on a scale of 1 to 10. It’s similar to the old Google PageRank and is logarithmic: that means it’s ten times as hard to move from a 3 to a 4 as it is to move from a 2 to a 3. Google PageRank is no longer an accurate measure – it’s updated very infrequently, and new sites are not given the API SOAP key that Google uses to effectively determine PageRank.
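The logarithmic idea can be illustrated with a toy formula: each extra point of rank requires roughly ten times the raw link authority. This is purely illustrative, not SEOmoz’s actual formula:

```python
import math

def log_scale_rank(raw_authority, max_rank=10):
    """Map raw link authority onto a 1-10 scale where each extra point
    needs ~10x the authority (illustrative only, not SEOmoz's formula)."""
    if raw_authority < 1:
        return 1.0
    return min(float(max_rank), 1 + math.log10(raw_authority))

# 10x the raw authority buys roughly one more point on the scale:
print(round(log_scale_rank(100), 2))   # 3.0
print(round(log_scale_rank(1000), 2))  # 4.0
```

This is why moving up the top of the scale is so much harder than moving up the bottom: the authority required grows by a factor of ten at every step.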

Each link back counts as a vote for your site and search engines often rank pages with higher global link authority ahead of pages with lower authority. Measures like mozRank are global and static, hence, this ranking power applies to a broad range of search queries, rather than pages optimized specifically for a particular keyword.

It’s a good idea to have the MozBar installed: you will see your own rank and also immediately know the rank of any site you visit. Open Site Explorer, another popular SEOmoz tool, also gives you your mozRank (mR) along with other factors such as page authority (PA) and domain authority (DA).

#3. Marketing Grader Tool by HubSpot

Are you doing enough to bring visitors to your website and fill the top of your sales and marketing funnel? How are you doing when it comes to converting traffic into leads and leads into customers? Do you know which marketing activities are working?

I like HubSpot’s Marketing Grader tool because it points out exactly where I can improve, based on reviewing over 30 factors, and then provides me with an overall marketing grade on a 0-100 scale.

The Marketing Grader tool studies over 30 different parameters including popularity metrics such as:

  • Are your marketing efforts generating sales and leads?
  • Is your blog driving results that justify the time investment or are you wasting time doing the wrong things?
  • Your mozRank and your Klout score
  • Your analytics and monthly web traffic
  • How effectively are you using social media to drive traffic to your website?
  • What are the strong points and suggestions for improvement in your marketing strategy?

Marketing Grader
Marketing Grader report for Web Content Blog showing different areas for improvement

#4. Analytics

Analytics software such as Google Analytics and Omniture analyzes different metrics, such as:

  • Number of page views, visitors, pages per visit, new visitors vs. returning visitors
  • Most popular posts based on number of page views
  • Most shares on social media
  • Other metrics such as demographics – visitors by geographic location, language spoken etc.
  • Metrics that measure engagement such as time on site and bounce rate.
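Metrics like pages per visit and bounce rate fall straight out of raw session data. Here is a sketch using a hypothetical data layout, where each session is simply a list of page views:

```python
def engagement_metrics(sessions):
    """Compute pages-per-visit and bounce rate (share of single-page
    sessions) from a list of sessions. The data layout is hypothetical,
    for illustration only."""
    if not sessions:
        return {"pages_per_visit": 0.0, "bounce_rate": 0.0}
    total_pages = sum(len(s) for s in sessions)
    bounces = sum(1 for s in sessions if len(s) == 1)
    return {
        "pages_per_visit": total_pages / len(sessions),
        "bounce_rate": bounces / len(sessions),
    }

sessions = [["/home"],
            ["/home", "/blog", "/contact"],
            ["/blog"],
            ["/home", "/about"]]
m = engagement_metrics(sessions)
print(m["bounce_rate"])      # 0.5 -- two of four sessions saw one page
print(m["pages_per_visit"])  # 1.75
```

Analytics packages compute these same ratios for you, but seeing the arithmetic makes it clear what a “bounce” actually is.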

#5. Social Media

In the current climate, your website’s popularity is also gauged by the number of times your site’s content is shared on social media. The more your site is shared, the more popular it naturally is. Each share, like, tweet, or reddit is counted by search engines as a vote for your site’s content, and social shares are an important way for search engines to determine the popularity of your website.
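The “votes” idea can be sketched as a simple weighted sum of social actions. The weights here are made up for illustration; search engines do not publish theirs:

```python
# Hypothetical weights -- search engines do not publish theirs.
WEIGHTS = {"share": 2.0, "like": 1.0, "tweet": 1.5, "reddit": 1.5}

def social_score(counts):
    """Combine per-network action counts into one popularity score,
    treating each action as a weighted 'vote' as described above."""
    return sum(WEIGHTS.get(kind, 1.0) * n for kind, n in counts.items())

print(social_score({"share": 10, "like": 40, "tweet": 20}))  # 90.0
```

Whatever the real weighting is, the principle is the same: more genuine social actions mean a higher popularity signal.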

Sites such as Topsy do a good job of showing which post or page of your website got the most shares. I also like URL shorteners such as bit.ly that tell you how many times a piece of content was shared and by whom, and tweetmeme, which shows you the hottest links on Twitter. The Social Overview and Social Sources sections of your Google Analytics are also pretty strong indicators of your site’s and content’s popularity. Of course, you can also get paid solutions such as Radian6 to monitor social media analytics.

If your website has a blog, there are other indicators, such as your site’s RSS feed subscriptions and number of email subscribers, that give you an idea of your blog’s popularity. Even though fewer people subscribe to RSS feeds, and those who do are mainly readers who are already technically savvy, you can get some idea by checking your Feedburner stats or your RSS feed stats through Google Reader. The number of email subscribers, and also the type (profession etc.) of subscribers, can give you an indication of which niche market your blog is best aimed at.

How about you? Which tools do you use to measure the popularity of your website, and why do you measure popularity? To gain more business? To write better content? Please share in the comments below. Thanks.


Filed Under: Seo Tagged With: alexa, global rank, hubspots market grader tool, measure site's popularity, mozrank, ranking of site, seo

Is Social Media the New SEO?

September 10, 2012 by Gazalla Gaya

“SEO is Dead” was the trending topic in one of my LinkedIn groups last month. The discussion was in response to a Forbes article, “The Death Of SEO: The Rise of Social, PR, And Real Content.”

The author, Ken Krogue, makes the case for social media being the new SEO:

“Google used to think if you linked to someone on the Internet they must have valuable content. Now Google seems to believe that if you promote content with social media it is more indicative of relevant content and less likely to be faked. It is hardly about the links anymore, it’s about the metrics of engagement on your site.”

Does this argument hold true? Has social media changed the SEO landscape so much that social shares matter most – and do other SEO strategies even count?

To understand the rising importance of social media in search, let’s backtrack to two interesting developments in search:

  1. The launch of Social Search (Content that is shared within your social networks is given priority) and
  2. The general inclusion of social signals in rankings (The amount of likes, shares, reddits etc. that an article receives has a direct impact on its rankings).

Google launched Social Search in 2009, as a feature that combines regular search results with publicly available data created by your friends’ social media activities. To use Google Social Search, simply do a search while logged in.

In Dec 2010, Danny Sullivan’s famous article, What Social Signals Do Google & Bing Really Count?, caused a buzz in the SEO industry. He wrote about social signals and their importance in ranking an article. Social signals are signals that search engines look for from popular social media sites: metrics such as how many people share a piece of content, and how many tweets, likes, reddits etc. a given piece of content got.

That same month, Matt Cutts confirmed that Google has incorporated social signals in ranking articles but is also considering the author who is posting the content. In other words, they give more weight to an influencer posting your content.
Here’s the video:

According to this video, we know that:

  • Google uses Twitter and Facebook links as signals in rankings. Social media likes, shares, tweets, reddits, and +1’s (Google’s obvious favorite) are regarded as votes for your site and your content.
  • The reputation of the author who is sharing the content is very important.
  • It’s currently (as of Dec 2010) used more in real-time search, but Google is looking to include it more broadly in web search as well.
  • It only includes data that can be crawled and indexed. In other words, private Facebook wall posts that cannot be crawled will not be indexed, as opposed to Facebook fan pages, which can be crawled.
  • Google ranks articles based on the quality of the followers sharing your content. In other words, it wants to make sure these are real people sharing your content, not just software bots.

Based on our own experience as authors and publishers of content, we know that social media plays a role in influencing search engine results by giving preference based on the authority of the author and the number of times a piece of content is shared on social networking sites. With this year’s updates, social search is also more prevalent, and no longer just a part of real-time search.

In early 2012, Social Search was given a major update. Google launched “Search, plus your World,” where it combined Google+ pages that were made public by its users with its regular search engine. You get personal results including photos uploaded by you or your connections, Google+ content, Google+ profiles and people and brands related to your search. Google +1’s are most easily indexed by Google and always show up prominently in search results.

The graphic below highlights a perfect example of Social Search in action. I follow Darren Rowse of Problogger and Ileane Smith of Basic Blog Tips. I typed in a search for “blogging” and their results showed up at the top of my search page:

Social Search
Social Search where people in my social network are given a priority in search results

Fast forward to June 2012 and Matt Cutts’s conversation with Danny Sullivan, in which he elaborated that even though social signals are important, backlinks still matter and count as a vote for your website.

Here is the video:

Danny Sullivan and Matt Cutts discuss SEO at SMX Advanced, June 2012
Here are the highlights of the conversation:

  • The web is comparable to space in its vastness and is currently the largest source of data in the world. For example, YouTube gets 72 hours of video uploaded every minute.
  • There is a perception in people’s minds that everything will go social, which at this point is a premature assumption.
  • In Matt Cutts’s own words, “In 10 years things will be more social but I wouldn’t write the epitaph for links quite yet.”

Backlinks still very much count as a vote for your site. But as Cutts reiterated, it’s not just any backlinks – it’s backlinks from authority sites that count the most.

Google currently uses over 200 signals to rank a page including:

  • Page Rank
  • High quality and original content
  • Number of backlinks
  • Anchor text
  • Keywords
  • HTML title tags

No sooner did Google announce the importance of social shares than spammers started flooding the internet with offers for Facebook likes, Twitter followers, and the like – and now, for a few dollars, you can buy fake Facebook likes, Twitter followers, and more. +1’s are a little harder, as Google requires you to verify your account before you open one. Just as black-hat SEOs in the past used various spammy backlinking techniques, these social media spammers have muddied the waters of social media popularity. As long as spammers and people trying to game the system exist, Google will not be able to rely on only one method for rankings.

Although social media is rapidly growing as a way to understand relevance and context, we’ve heard from the horse’s mouth that it’s not the only way. The bottom line is that every SEO strategy needs to account for a good social media policy while also keeping all the other strategies in mind: gaining quality backlinks (through posting the best content, guest posting and commenting on other blogs), sound keyword research, and optimizing your HTML tags and metadata.

Do you agree or disagree? Please share your thoughts in the comments below.


Filed Under: Seo Tagged With: improve rankings with social shares, increase in social shares, matt cutts, seo and social media

