Affiliate marketing and making money online can be a great way to generate a second income. But is a home-based business really for you? Pros and cons: that's what this blog post is about.
If you promote shady CPA offers that con consumers with fake testimonials or misrepresented products just to make a sale, you could be sued personally. And not only that: the network you promote for could have its assets frozen and may never pay out your commissions.
The FTC has brought yet another case, this time against unethical CPA affiliates who use fake news sites to promote products such as acai berry supplements. One of the companies being sued is a popular affiliate network that most affiliates know. Note: if you are using this network, you may not get paid, because the FTC is trying to freeze its assets. (Story and links below.)
But first I wanted to point out that it's not only the networks that will get busted, but the affiliates as well. The Illinois Attorney General directly sued an individual affiliate marketer in connection with this FTC sweep.
Illinois Attorney General Sues Affiliate over Acai Scams
Chicago - Attorney General Lisa Madigan today sued a Chicago-area man for fraudulently marketing acai berry weight-loss products online, as part of a national sweep with the Federal Trade Commission against affiliate marketers who con consumers into buying weight-loss products through fake news sites.
The FTC just held a press conference; here is the current press release.
The FTC complaints charge that the typical fake news site carries a title such as "News 6 News Alerts," "Health News Health Alerts," or "Health 5 Beat Health News." The sites often use the names and logos of major media outlets, such as ABC, Fox, CBS, CNN, USA Today, and Consumer Reports, and falsely represent that the reports on the sites have been seen on these networks. An investigative-sounding headline on one such site announced "Acai Berry Diet Exposed: Miracle Diet or Scam?" The subhead reads, "As part of a new series: 'Diet Trends: A Look at America's Top Diets,' we examine consumer tips for dieting during a recession." The article that follows purports to document a reporter's firsthand experience with acai berry supplements, usually claiming a loss of 25 pounds in four weeks.
"Almost everything about these sites is fake," said David Vladeck, Director of the FTC's Bureau of Consumer Protection. "The weight-loss results, the so-called investigations, the reporters, the consumer testimonials, and the attempt to portray an objective, journalistic endeavor."
The following companies were named in the FTC complaints:
The only name I recognize is Intermark Media, AKA COPEAC, a well-known affiliate network.
According to the FTC press release, "the FTC seeks to permanently stop this deceptive practice and has asked courts to freeze the operations' assets pending trial." So I have to ask: if they owe affiliates commissions, can they even pay???
As I always say, "Market with integrity and everyone WINS!" If you market spammy, scammy offers and the merchant goes broke and doesn't pay you, then you got what you deserved. Not only that, but you could get yourself into real trouble. It is simply not worth it!
I have long warned affiliates to stay away from gambling, RX, and other risky affiliate programs that are, or could be, under government scrutiny in the future.
On Friday the FBI seized and shut down the websites of PokerStars, Full Tilt Poker, UB.com, and Absolute Poker, four of the largest online poker sites. The charges reportedly include bank fraud, money laundering, and illegal gambling offenses. Supposedly, not even customers can withdraw their money now that the sites have been shut down.
I believe all of these companies had affiliate programs. Do they owe any of you money? I am curious whether they will even be able to pay affiliates what they owe after the heavy fines. I assume their bank accounts and other assets have been frozen as well.
The end result? The sites are shuttered, and 76 bank accounts in 14 countries have been frozen. The parties face money-laundering penalties of up to $3 billion. On top of that, the defendants each face up to 5 years, and in some cases up to 30 years, in prison as the maximum sentences for their alleged crimes.
I'm curious whether the companies have emailed their affiliates about the situation. If you've heard anything, affiliates, please share it in the comments.
You’ve heard the phrase before: Content is King. But the bigger the kingdom, the harder it is to maintain control over it. Well, if you run a content fiefdom, the same principle applies.
You see, being an enterprise level publisher is a double-edged SEO sword. On one hand, the more content you have, the more there is for Google to index, and the more short-, medium-, and long-tail keywords you can end up ranking for. On the other hand, there's also more content and pages to confuse Google, meaning a greater chance of being seen as a spammer and getting penalized when you didn't intend to violate any webspam guidelines.
So how do enterprise level publishers get the most out of their content and avoid looking like spammers? Well, it all starts with some solid onsite SEO.
Online publishers are driven by three mandates: (1) reader acquisition, (2) converting those readers into repeat users, and (3) retaining those users so that they can continue to grow their user-base/audience. Well, SEO plays an integral part in both the acquisition and conversion processes.
Specifically, a comprehensive SEO strategy will ensure that:
- onsite content/pages are optimized for targeted terms
- duplicate content does not become a problem
The first place that all SEO should start is onsite and on-page. Specifically, each page should have its own unique and targeted meta info. Each of the following elements is important:
- Page Titles: keep them to roughly 65 characters
- Meta Descriptions: unique and targeted for each page
- Meta Keywords: don't bother, because the big search engines stopped indexing this field in 2005, so all meta keywords do is tell your competitors what you're trying to rank for.
By unique, I mean that no two pages should have the same Title or Meta Description. Every possible page, from articles to categories, should have different titles and descriptions that are optimized for slightly different keywords.
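To make this concrete, here is a minimal sketch of what unique, targeted meta info might look like for an article page versus a category page; the site name, URLs, and keywords are made-up examples, not from any real site:

```html
<!-- Article page: targets a specific long-tail term -->
<head>
  <title>How to Brew Cold Brew Coffee at Home | Example Coffee Mag</title>
  <meta name="description" content="A step-by-step guide to making smooth cold brew coffee at home with basic kitchen equipment.">
</head>

<!-- Category page: targets a broader, higher-volume term -->
<head>
  <title>Coffee Brewing Guides &amp; Tips | Example Coffee Mag</title>
  <meta name="description" content="Browse all of our coffee brewing guides, from espresso basics to cold brew techniques.">
</head>
```

Note that the two pages share a topic but no title or description text, so each can rank for its own set of keywords.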
By targeted, I mean that you shouldn't just stuff whatever keywords in there that you think are relevant. Rather, you should use tools like Google AdWords' Keyword Tool to determine which relevant keyword combinations have the highest search volumes, and optimize your page around those. It could mean the difference between targeting 1,000 searchers a month and targeting 100,000 searchers a month.
Duplicate content is treated as spam by search engines because they see it as taking two copies of the same page and trying to make them look like different pages so that you can (unfairly and inaccurately) increase your rankings on more sets of keywords. Duplicate content is a problem for dynamic publishers because, often, the same page can be called up through different criteria, such as categories and tags.
Duplicate content poses two challenges for dynamic publishers. First, it confuses search engines, so they're not sure which duplicated page to include in their index. Second, search engines might end up seeing it as spam, and penalize or outright ban your site from the SERPs altogether.
On major publishing sites with multiple categories and tags (especially blogs), duplicate content can be common in three different places:
- Index Page: if your index page features some latest or featured articles/posts (like on a blog), then you'll probably have some content overlap between your index page and other category pages.
- Categories/Tags: if your articles/posts can appear in multiple categories/tags, you will almost certainly have duplicate content issues across several category/tag pages.
- Article/Post Pages: if your index, blog, category, and/or tag pages feature the articles/posts in full, then you'll have duplicate content problems between those articles/posts and the other places they appear in full on your site.
There are five steps you can take to ensure that duplicate content issues do not affect your site's indexation or rankings.
First of all, the only place where an article/post should appear in its entirety is on the actual article/post page. Any other page that might list that article (e.g. blog page, index page, category page, or tag page) should feature only a teaser from that article.
Ideally, your teasers should be completely unique, and not appear in the article itself. However, many large publishing sites choose to just feature the first 300 or so characters from the article, and it doesn’t seem to hurt their rankings.
As mentioned above, give every page a unique and targeted page title and meta description. This set of information is the first thing that search engines look at to determine what a page is all about, so this is your first opportunity to let search engines know exactly why any page is unique from another. This will be particularly important for category and tag pages, where content can be duplicated several times over.
The next place to let search engines know that a page is unique is through its content. So while two category/tag pages might list many of the same articles/posts/links, you can intervene by giving every page some unique static content that appears at the top.
There are two ways you should do this: (1) with a unique H1 tag (hint: keep it related to your title tag), and (2) with an additional descriptive paragraph that appears between that H1 tag and the content feed that may be duplicating content from other areas of your site.
If your site produces a lot of content across many categories and tags, it is probably not feasible to produce unique page titles, meta descriptions, H1 tags, and intro paragraphs for every page — especially in the case of tags, which tend to be added on an ad hoc, ongoing basis. In this case, you will want to just tell search engines to ignore the really redundant pages, and you can do this in two ways:
- by adding a robots "noindex" meta tag to the page source of each redundant page
- by adding these redundant pages to your robots.txt file
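The first approach is a one-line addition to the head of each redundant page. A minimal sketch:

```html
<!-- Tells compliant crawlers not to include this page in their index,
     while still following the links it contains -->
<meta name="robots" content="noindex, follow">
```

Using "noindex, follow" rather than "noindex, nofollow" lets link equity continue to flow through the excluded page to the articles it lists.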
Of course, this raises the question: how do you know which pages are redundant? Generally, it is best to exclude pages that are there for usability or navigation, but not for search engines. As a rule, these tend to include:
- tag pages (but keep category pages),
- author pages/feeds (unless you have high-profile authors you want to rank for),
- archive-by-date pages
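The robots.txt approach for that same list of pages can be sketched as follows; the directory paths are hypothetical and depend on how your CMS structures its URLs:

```
# robots.txt — keep crawlers out of redundant navigation pages
User-agent: *
Disallow: /tag/
Disallow: /author/
Disallow: /archives/
```

One caveat: robots.txt blocks crawling rather than indexing per se, so when you need a page crawled (for its links) but kept out of the index, the noindex meta tag is the more precise tool.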
Finally, many enterprise level publishers tend to syndicate content across many sites. This can result in another kind of duplicate content.
So what do you do if you want to share relevant content with your users without getting penalized as a spammer? Well, the answer is simple: use a canonical tag.
By adding a canonical link tag to the page source of the syndicated page, you can tell the search engines, "This content is duplicated, and the original version is over there." This way, the search engines know that you have nothing to hide. The canonical tag will look something like this:
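A minimal sketch, with a placeholder URL standing in for the original article's address:

```html
<!-- In the <head> of the syndicated copy; href points to the original -->
<link rel="canonical" href="http://www.example.com/original-article/">
```

Search engines then consolidate ranking signals on the URL named in the href, rather than splitting them across the duplicates.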
Whether we’re talking about business or something else, size is something that can work either for or against you. For instance, large companies have more resources, but are slower to adapt to changes in the marketplace. From an SEO perspective, enterprise level publishers have more content to be indexed, attract links, and help them rank on more terms, but the bigger the sitemap, the easier it is for search engines to get lost.
The right blend of onsite SEO, however, can make a huge difference when it comes to ranking on more terms and not getting penalized in the process. What it really comes down to is ensuring that each page is as unique as possible (think page title, meta description, and H1 tag), and that there's as little content duplicated in different places as possible. By taking these steps, not only can enterprise level publishers avoid penalties, but over time they will rank on longer and longer tail terms, and see an incredible amount of their organic traffic coming through on older pieces of content.
CT Moore is a strategist at the SEO services company NVI, which specializes in SEO, PPC, and social media campaigns. He is also an accomplished writer, blogger, and speaker, as well as a Staff Editor at Revenews.com.