This is a guest post by CT Moore.
You’ve heard the phrase before: Content is King. But the bigger the kingdom, the harder it is to maintain control over it. Well, if you run a content fiefdom, the same principle applies.
You see, being an enterprise-level publisher is a double-edged SEO sword. On one hand, the more content you have, the more there is for Google to index, and the more short-, medium-, and long-tail keywords you can end up ranking for. On the other hand, there’s also more content and pages to confuse Google, meaning a greater chance of being seen as a spammer and getting penalized even when you didn’t intend to violate any webspam guidelines.
So how do enterprise-level publishers get the most out of their content and avoid looking like spammers? Well, it all starts with some solid onsite SEO.
Online publishers are driven by three mandates: (1) acquiring readers, (2) converting those readers into repeat users, and (3) retaining those users so that they can continue to grow their user base/audience. SEO plays an integral part in both the acquisition and conversion processes.
Specifically, a comprehensive SEO strategy will ensure that:
- onsite content and pages are optimized for targeted terms, and
- duplicate content does not become a problem.

The first place all SEO should start is onsite and on-page. Specifically, each page should have its own unique and targeted meta info. Each of the following elements is important:
Page Titles and Meta Descriptions: By unique, I mean that no two pages should have the same title or meta description. Every page, from articles to categories, should have a different title and description, optimized for slightly different keywords.
By targeted, I mean that you shouldn’t just stuff whatever keywords you think are relevant in there. Rather, you should use tools like Google AdWords’ Keyword Tool to determine which relevant keyword combinations have the highest search volumes, and optimize your pages around those. It could mean the difference between targeting 1,000 searchers a month and targeting 100,000 searchers a month.
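For instance, the head of a category page might look something like this (the site name, page, and keywords here are hypothetical):

    <!-- a hypothetical category page: every page gets its own targeted title and description -->
    <head>
      <title>Vintage Mountain Bike Reviews and Buying Guides | ExampleMag</title>
      <meta name="description" content="Hands-on reviews, restoration tips, and buying guides for vintage mountain bikes from ExampleMag's editors.">
    </head>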
Duplicate content is treated as spam by search engines because they see it as taking two copies of the same page and trying to make them look like different pages so that you can (unfairly and inaccurately) increase your rankings on more sets of keywords. Duplicate content is a problem for dynamic publishers because the same article can often be reached through more than one URL, such as category and tag pages.
Dynamic publishing sites pose two challenges for publishers here. First, they confuse search engines, which aren’t sure which duplicated page to include in their index. Second, search engines might end up seeing the duplication as spam, and penalize your site or ban it from the SERPs altogether.
On major publishing sites with multiple categories and tags (especially blogs), duplicate content can be common in three different places:
- Index Page: if your index page features the latest or featured articles/posts (like on a blog), then you’ll probably have some content overlap between your index page and other category pages.
- Categories/Tags: if your articles/posts can appear in multiple categories/tags, you will most certainly have duplicate content issues across several category/tag pages.
- Article/Post Pages: if your index, blog, category, and/or tag pages feature articles/posts in full, then you’ll have duplicate content problems between those articles/posts and the other places they appear in full on your site.

There are five steps you can take to ensure that duplicate content issues do not affect your site’s indexation or rankings.
First of all, the only place an article/post should appear in its entirety is on the actual article/post page. Any other page that might list that article (e.g. blog page, index page, category page, or tag page) should feature only a teaser from that article.
Ideally, your teasers should be completely unique, and not appear in the article itself. However, many large publishing sites choose to just feature the first 300 or so characters from the article, and it doesn’t seem to hurt their rankings.
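As a rough sketch, a teaser entry on a listing page might look something like this (the markup, copy, and URLs are hypothetical):

    <!-- only an excerpt or unique summary appears here, never the full article body -->
    <article class="teaser">
      <h2><a href="/articles/vintage-mountain-bikes-guide/">Vintage Mountain Bikes: A Buyer's Guide</a></h2>
      <p>Thinking about restoring a classic hardtail? Here's what to look for before you buy...</p>
      <a href="/articles/vintage-mountain-bikes-guide/">Read more</a>
    </article>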
As mentioned above, give every page a unique and targeted page title and meta description. This set of information is the first thing that search engines look at to determine what a page is all about, so this is your first opportunity to let search engines know exactly how one page differs from another. This will be particularly important for category and tag pages, where content can be duplicated several times over.
The next place to let search engines know that a page is unique is through its content. So while two category/tag pages might list many of the same articles/posts/links, you can intervene by giving every page some unique static content that appears at the top.
There are two ways you should do this: (1) with a unique H1 tag (hint: keep it related to your title tag), and (2) with an additional descriptive paragraph that appears between that H1 tag and the content feed that may duplicate content from other areas of your site.
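For example, the top of a tag page might look something like this (the tag and copy are hypothetical):

    <!-- unique static content at the top of a tag page, above the duplicated feed -->
    <h1>Vintage Mountain Bikes</h1>
    <p>Reviews, restoration guides, and buying advice for classic mountain bikes from the '80s and '90s.</p>
    <!-- ...the feed of teasers, which may also appear on other category/tag pages, follows here... -->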
If your site produces a lot of content across many categories and tags, it is probably not feasible to produce unique page titles, meta descriptions, H1 tags, and intro paragraphs for every page, especially in the case of tags, which tend to be added on an ad hoc, ongoing basis. In this case, you will want to just tell search engines to ignore the really redundant pages, and you can do this in two ways (both sketched below):
- by adding a robots meta tag set to noindex to the page source of each redundant page, or
- by adding these redundant pages to your robots.txt file.

Of course, this begs the question: how do you know which pages are redundant? Well generally, it is best to exclude pages that are there for usability or navigation, but not for search engines. As a rule, these tend to include:

- tag pages (but keep category pages),
- author pages/feeds (unless you have high-profile authors you want to rank for), and
- archive-by-date pages.
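Here is a minimal sketch of both approaches (the excluded paths are hypothetical examples):

    <!-- Option 1: in the <head> of each redundant page, e.g. a tag page -->
    <meta name="robots" content="noindex, follow">

    # Option 2: in robots.txt, block entire redundant sections by path
    User-agent: *
    Disallow: /tag/
    Disallow: /author/
    Disallow: /2009/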
Finally, many enterprise-level publishers tend to syndicate content across many sites. This can result in another kind of duplicate content.
So what do you do if you want to share relevant content with your users without getting penalized as a spammer? Well, the answer is simple: use a canonical tag.
By adding a canonical link tag to the page source of the duplicated page, you can tell the search engines, “This content is duplicated, and the original version is over there.” This way, the search engines know that you have nothing to hide. The canonical tag will look something like this:
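    <!-- placed in the <head> of the duplicate page; the href (a hypothetical URL) points to the original -->
    <link rel="canonical" href="http://www.example.com/original-article/" />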
You can get more info about the canonical tag here.
Whether we’re talking about business or something else, size is something that can work either for or against you. For instance, large companies have more resources, but are slower to adapt to changes in the marketplace. From an SEO perspective, enterprise-level publishers have more content to be indexed, to attract links, and to help them rank on more terms, but the bigger the sitemap, the easier it is for search engines to get lost.
The right blend of onsite SEO, however, can make a huge difference when it comes to ranking on more terms without getting penalized in the process. What it really comes down to is ensuring that each page is as unique as possible (think page title, meta description, and H1 tag), and that as little content as possible is duplicated in different places. By taking these steps, not only can enterprise-level publishers avoid penalties, but over time they will rank on longer and longer tail terms, and see an incredible amount of their organic traffic coming through on older pieces of content.
CT Moore is a strategist at the SEO services company NVI, which specializes in SEO, PPC, and Social Media campaigns. He is also an accomplished writer, blogger, and speaker, as well as a Staff Editor at Revenews.com.