I often start reading them on the train to work and finish on the way home – or in chunks. You’re one of the few sites out there that reads beautifully on a mobile device (again – because you’ve purposely built it that way). I usually prefer reading longer-form articles, or more specifically how-to guides, on a desktop, since the embedded information is almost always a pain on a mobile device, but you definitely buck the trend there.
One common scam is the creation of "shadow" domains that funnel users to a site by using deceptive redirects. These shadow domains are often owned by the SEO who claims to be working on a client's behalf. However, if the relationship sours, the SEO may point the domain at a different site, or even at a competitor's domain. If that happens, the client has paid to develop a competing site owned entirely by the SEO.

Understanding competitor web traffic can give you ideas for your own digital marketing campaigns, reveal opportunities to attract new visitors, or even show you which channels you may or may not be using effectively. Even if you’re at the top of your industry, staying there can be a challenge, so keeping tabs on competitors’ efforts can help you stay ahead of the curve.


“Shareability” – Not every single piece of content on your site will be linked to and shared hundreds of times. But just as you want to be careful not to roll out large quantities of pages with thin content, you should consider who would be likely to share and link to new pages before you roll them out. Having large quantities of pages that aren’t likely to be shared or linked to doesn’t position those pages to rank well in search results, and doesn’t help to create a good picture of your site as a whole for search engines, either.
The Google, Yahoo!, and Bing search engines insert advertising on their search results pages. The ads are designed to look similar to the search results, though different enough for readers to distinguish between ads and actual results. This is done with various differences in background, text, link colors, and/or placement on the page. However, the appearance of ads on all major search engines is so similar to genuine search results that a majority of search engine users cannot effectively distinguish between the two.[1]
Not sure exactly why – perhaps I used too big a number, and since my page is about classifieds, it probably seemed like too much to browse through 1,500 ads, I assume? Somewhat like if you posted 800 tips for better ranking? Don’t know, I will try to change things a bit and see how it goes, but you really gave me some new suggestions to go for with this article. Thanks again 🙂

I would like to thank Ross for this AMAZING post. There are too many internet marketers out there struggling to get traffic. How many people out there have mind-blowing websites that the world NEEDS but will never get enough traffic to get their ideas out to the public? How many people are stuck at 9-to-5s, struggling to make money online, only because they just CAN’T GET TRAFFIC? This is an extremely thoughtful post. The world needs more people who would create an article like this that could help the struggling moms out there trying to make money online.

Essentially, what distinguishes direct from organic traffic today is tracking. According to Business2Community, direct traffic is composed of website visits which have “no referring source or tracking information.” A referring source can be a search engine, or it can be a link from another website. Direct traffic can include visits that result from typing the URL directly into a browser, as the simple definition suggests.
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
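
To see why this is only a courtesy mechanism, here is a minimal Python sketch (the site and file path are hypothetical): a polite crawler checks robots.txt and backs off, but a direct request for the same URL is still served unless the server enforces its own access control.

```python
# Minimal sketch: robots.txt is advisory only. The URLs below are hypothetical.
import urllib.request
import urllib.robotparser

SITE = "https://www.example.com"               # hypothetical site
PRIVATE_URL = SITE + "/private/report.html"    # imagine robots.txt lists Disallow: /private/

# A well-behaved crawler consults robots.txt first and backs off...
rp = urllib.robotparser.RobotFileParser()
rp.set_url(SITE + "/robots.txt")
rp.read()
print("Polite crawler may fetch:", rp.can_fetch("MyCrawler", PRIVATE_URL))

# ...but nothing stops anyone from requesting the "blocked" page directly;
# the server will still deliver it unless it applies its own access control.
with urllib.request.urlopen(PRIVATE_URL) as response:
    print("Direct request status:", response.status)
```

If content is genuinely confidential, protect it with authentication or take it off the server rather than relying on robots.txt.
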
Hi, my name is Dimitrios and I am responsible for Crave Culinaire’s digital marketing. I would like to drive more traffic to Crave’s blog. Since Crave Culinaire is the only catering company that provides molecular cuisine, I thought about crafting a blog post about that. The influencers in this niche have great success in utilizing recipes on their blogs. I will share some recipes from Brian Roland, owner and head chef of Crave Culinaire.

I’ve always been one to create great content, but now I see it may not necessarily be the right content. Can Share Triggers work for all niches, including things like plumbing companies, computer repair, maybe even handymen who have a website for their business? I’d estimate I’m getting about half the monthly views I should. Hopefully some of these strategies will help.

To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[46]
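
To illustrate the meta tag route, here is a small Python sketch, using an assumed example page, of how an indexer might honour a noindex directive: the page can still be crawled, but it is kept out of the index.

```python
# Sketch: honour a <meta name="robots" content="noindex"> directive.
# The example page markup below is made up for illustration.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the directives from any <meta name="robots" ...> tag."""
    def __init__(self):
        super().__init__()
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            for token in (attrs.get("content") or "").lower().split(","):
                self.directives.add(token.strip())

def should_index(html: str) -> bool:
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" not in parser.directives

# Example page that opts out of indexing:
page = '<html><head><meta name="robots" content="noindex, follow"></head><body>Internal search results</body></html>'
print(should_index(page))  # False - the page was crawled but is not added to the index
```
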

Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[63] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[64] As of 2006, Google had an 85–90% market share in Germany.[65] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[65] As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise.[66] That market share is achieved in a number of countries.


Thank you Brian, this is awesome! About publishing studies, how do you gather all this unique data? How did you get access to behind-the-scenes data from 1.3M videos to analyze? We recently published an infographic on a client’s blog, but it’s just data we quoted from other sites, not unique. I wonder if you can get your own stats when you have a small site.
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters only needed to submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
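
As a rough illustration of that spider/indexer split, here is a deliberately simplified Python sketch (the seed URL is hypothetical, and no real engine works quite like this): the spider downloads a page, the indexer records which words appear on it, and newly discovered links are queued for a later crawl.

```python
# Highly simplified sketch of a spider + indexer + scheduler, stdlib only.
import re
import urllib.parse
import urllib.request
from collections import defaultdict, deque
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

inverted_index = defaultdict(set)               # word -> URLs containing it
schedule = deque(["https://www.example.com/"])  # hypothetical seed URL
seen = set()

while schedule and len(seen) < 10:              # small cap, since this is a sketch
    url = schedule.popleft()
    if url in seen:
        continue
    seen.add(url)

    # Spider: download the page.
    with urllib.request.urlopen(url) as response:
        html = response.read().decode("utf-8", errors="ignore")

    # Indexer: record which words appear on this page (markup included here;
    # a real indexer would strip tags and weight terms by position).
    for word in re.findall(r"[a-z0-9]+", html.lower()):
        inverted_index[word].add(url)

    # Scheduler: queue the page's links to be crawled later.
    extractor = LinkExtractor()
    extractor.feed(html)
    schedule.extend(urllib.parse.urljoin(url, link) for link in extractor.links)
```
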
Earlier in the comment stream, there was a brief discussion about page load time/website speed and its effect on page ranking. I have tried to find unbiased information about which hosting company to use when starting a blog or a small WordPress site, keeping in mind the importance of speed. This endeavor has been harder than expected, as most hosting review sites have some kind of affiliate relationship with the hosting companies they review.
Social media has a pivotal role – Last but not least, social media is an evolving platform that has changed from a basic communication platform to a highly profitable marketing channel. Many users start their searches on social media and make their way to a business’s site. Sharing up-to-date, engaging, and personalized content will attract more people to your profile, and eventually to your website.

Beyond organic and direct traffic, you must understand the difference between all of your traffic sources and how traffic is classified. Most web analytics platforms, like Google Analytics, use an algorithm and flow chart based on the referring website or on parameters set within the URL to determine the source of traffic. The most common classifications are direct, organic search, paid search, referral, social, email, and display.
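
To make that flow concrete, here is a rough Python sketch of such a classification. The real rules in Google Analytics are more involved; the tracking parameters checked below (gclid, utm_medium) are real, but the domain lists and channel names are illustrative only.

```python
# Rough sketch of a traffic-source decision flow; not any platform's exact rules.
from urllib.parse import urlparse, parse_qs

SEARCH_ENGINES = {"www.google.com", "www.bing.com", "search.yahoo.com", "duckduckgo.com"}
SOCIAL_SITES = {"www.facebook.com", "twitter.com", "www.linkedin.com", "www.pinterest.com"}

def classify_visit(landing_url, referrer):
    """Return a rough traffic-source label for a single visit."""
    params = parse_qs(urlparse(landing_url).query)
    medium = params.get("utm_medium", [""])[0]

    # 1) Explicit tracking parameters in the URL win over everything else.
    if "gclid" in params or medium in ("cpc", "ppc", "paidsearch"):
        return "paid search"
    if medium == "email":
        return "email"

    # 2) Otherwise fall back to the referring website, if there is one.
    if referrer:
        host = urlparse(referrer).netloc
        if host in SEARCH_ENGINES:
            return "organic search"
        if host in SOCIAL_SITES:
            return "social"
        return "referral"

    # 3) No referrer and no tracking information: counted as direct.
    return "direct"

print(classify_visit("https://example.com/?utm_medium=email", None))            # email
print(classify_visit("https://example.com/", "https://www.google.com/search"))  # organic search
print(classify_visit("https://example.com/", None))                             # direct
```

The key point is the order of the checks: explicit campaign tagging is trusted first, the referrer is consulted next, and only a visit with neither is counted as direct.
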

1- What if my site is new and I have just started publishing blog posts – how can I rank on the first page? Do you think that following the steps mentioned in the article would be enough? How do I move my blog from 0 to the first page? So in a nutshell, in a world where the competition is really tough for some keywords and gaining backlinks is a real challenge, how do I improve my authority from zero?

SEO is not an appropriate strategy for every website, and other Internet marketing strategies can be more effective, such as paid advertising through pay-per-click (PPC) campaigns, depending on the site operator's goals. Search engine marketing (SEM) is the practice of designing, running, and optimizing search engine ad campaigns.[55] Its difference from SEO is most simply depicted as the difference between paid and unpaid priority ranking in search results. Its purpose concerns prominence more than relevance; website developers should give SEM the utmost importance with consideration to visibility, as most users navigate to the primary listings of their search.[56] A successful Internet marketing campaign may also depend upon building high-quality web pages to engage and persuade, setting up analytics programs to enable site owners to measure results, and improving a site's conversion rate.[57] In November 2015, Google released a full 160-page version of its Search Quality Rating Guidelines to the public,[58] which revealed a shift in its focus towards "usefulness" and mobile search. In recent years the mobile market has exploded, overtaking the use of desktops, as shown by StatCounter in October 2016, when it analysed 2.5 million websites and found that 51.3% of pages were loaded by a mobile device.[59] Google has been one of the companies taking advantage of the growth in mobile usage by encouraging websites to use its Google Search Console and the Mobile-Friendly Test, which let companies check how their website performs in search results and how user-friendly it is.
Great article, learned a lot from it! But I still don’t really get the Share Triggers and the right content. For instance, the influencers now care a lot about the new Koenigsegg Agera RS >> https://koenigsegg.com/blog/ (Car). I thought about an article like “10 things you need to know about the Koenigsegg Agera RS”. The only problem is that I don’t know which keywords I should use and how I can put in Share Triggers.
Structured data[21] is code that you can add to your site's pages to describe your content to search engines, so they can better understand what's on your pages. Search engines can use this understanding to display your content in useful (and eye-catching!) ways in search results. That, in turn, can help you attract just the right kind of customers for your business.
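
As an example of what that code can look like, here is a small Python sketch that builds a schema.org Recipe block as JSON-LD and wraps it in the script tag a page would embed; the property names come from schema.org, but the values below are made-up placeholders.

```python
# Sketch: generate a schema.org JSON-LD snippet for embedding in a page's <head>.
# The Recipe type and property names are from schema.org; the values are placeholders.
import json

recipe = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Example Molecular Gastronomy Dish",          # placeholder title
    "author": {"@type": "Person", "name": "Example Chef"}, # placeholder author
    "recipeYield": "4 servings",
    "recipeIngredient": ["ingredient one", "ingredient two"],
}

# Wrap the block in the script tag that would be embedded in the page.
snippet = '<script type="application/ld+json">\n{}\n</script>'.format(
    json.dumps(recipe, indent=2)
)
print(snippet)
```
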