And finally, the other really important bucket is authority. Google wants to show sites that are popular. If it can show the most popular t-shirt seller to people looking to buy t-shirts online, that’s the site it wants to show. So you have to convince Google by sending signals that your site is the most popular site for the kind of t-shirts you sell. Fill this bucket by building a fan base: build a social network, get people to link to you, get people to share your t-shirt pages on their social networks saying ‘I want this!’, get people to comment, leave testimonials, and post pictures of themselves wearing or using the product. Create a fan base and then rally them to link to you and talk about you. That’s how you prove to Google that you are trustworthy and authoritative.
In 2007, Google announced a campaign against paid links that transfer PageRank.[29] On June 15, 2009, Google disclosed that it had taken measures to mitigate the effects of PageRank sculpting through the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[30] As a result of this change, using nofollow led to evaporation of PageRank. To get around this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus still permit PageRank sculpting. Additionally, several solutions have been suggested that include the use of iframes, Flash, and JavaScript.[31]
Your site’s URL structure can be important both from a tracking perspective (a logical, segmented URL structure makes it easier to slice data in reports) and from a shareability standpoint (shorter, descriptive URLs are easier to copy and paste and are less likely to get cut off accidentally). Again: don’t try to cram in as many keywords as possible; create a short, descriptive URL.
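For illustration, here is a minimal Python sketch of the tracking benefit: with a logical path structure such as /t-shirts/&lt;category&gt;/&lt;product&gt; (the URLs and pageview numbers below are made up), segmenting report data by category is a simple string operation.

```python
# Hypothetical pageview export: (URL, pageviews). With a segmented URL
# structure, the category is just one path component.
from collections import defaultdict
from urllib.parse import urlparse

pageviews = [
    ("https://example.com/t-shirts/graphic/retro-robot", 120),
    ("https://example.com/t-shirts/plain/black-crew-neck", 95),
    ("https://example.com/t-shirts/graphic/space-cat", 240),
]

by_category = defaultdict(int)
for url, views in pageviews:
    parts = urlparse(url).path.strip("/").split("/")
    if len(parts) >= 2 and parts[0] == "t-shirts":
        by_category[parts[1]] += views  # group by the category segment

for category, views in sorted(by_category.items(), key=lambda kv: -kv[1]):
    print(f"{category}: {views} pageviews")
```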
Essentially, what distinguishes direct from organic traffic today is tracking. According to Business2Community, direct traffic is composed of website visits which have “no referring source or tracking information.” A referring source can be a search engine, or it can be a link from another website. Direct traffic can include visits that result from typing the URL directly into a browser, as the simple definition suggests.
3. General on-site optimization. On-site optimization is a collection of tactics, most of which are simple to implement, geared toward making your website more visible and indexable to search engines. These tactics include things like optimizing your titles and meta descriptions to include some of your target keywords, ensuring your site’s code is clean and minimal, and providing ample, relevant content on every page. I’ve got a huge list of on-site SEO tactics you can check out here.
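As a rough sketch of what an on-site audit can look like in practice (assuming the requests and beautifulsoup4 packages are installed; the URLs and length cut-offs below are illustrative rules of thumb, not official limits):

```python
# Check each page's title and meta description for presence and rough length.
import requests
from bs4 import BeautifulSoup

pages = ["https://example.com/", "https://example.com/t-shirts/"]

for url in pages:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    meta = soup.find("meta", attrs={"name": "description"})
    description = meta["content"].strip() if meta and meta.get("content") else ""

    if not title or len(title) > 60:
        print(f"{url}: check title ({len(title)} chars)")
    if not description or len(description) > 160:
        print(f"{url}: check meta description ({len(description)} chars)")
```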
I would like to talk about a case study for a large startup I worked on for over eight months in the Australian and US markets. This client originally came to the company with the typical link building and SEO problems. They had been using an SEO company that ran an extensive link network and had relied on less than impressive SEO tactics and methodologies over the previous 12 months. The company was also losing considerable revenue as a direct result of this low-quality SEO work. So I had to scramble and develop a revival strategy for this client.
Organic traffic, on the other hand, consists of those visits which are tracked by another entity — usually because they have arrived through search engines — but also from other sources. HubSpot’s definition emphasizes the term “non-paid visits,” because paid search ads are considered a category of their own. But this is where the lines between direct and organic start to get a little blurry.
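To make the distinction concrete, here is a simplified Python sketch of how an analytics tool might bucket a visit as direct, organic, or referral based on the HTTP referrer. The search-engine list is illustrative, not exhaustive, and real analytics platforms use more signals than this.

```python
# Classify a visit from its referrer: no referrer -> direct,
# known search engine -> organic, anything else -> referral.
from typing import Optional
from urllib.parse import urlparse

SEARCH_ENGINES = {"www.google.com", "www.bing.com", "duckduckgo.com", "search.yahoo.com"}

def classify_visit(referrer: Optional[str]) -> str:
    if not referrer:
        return "direct"      # no referring source or tracking information
    host = urlparse(referrer).netloc.lower()
    if host in SEARCH_ENGINES:
        return "organic"     # unpaid search-engine referral
    return "referral"        # link from another website

print(classify_visit(None))                             # direct
print(classify_visit("https://www.google.com/search"))  # organic
print(classify_visit("https://blog.example.com/post"))  # referral
```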
Provide full functionality on all devices. Mobile users expect the same functionality - such as commenting and checkout - and the same content on mobile as on every other device your website supports. In addition to textual content, make sure that all important images and videos are embedded and accessible on mobile devices. For search engines, provide all structured data and other metadata - such as titles, descriptions, link elements, and other meta tags - on all versions of the pages.

Thick & Unique Content – There is no magic number in terms of word count, and if you have a few pages of content on your site with a handful to a couple hundred words, you won’t fall out of Google’s good graces. In general, though, recent Panda updates in particular favor longer, unique content. If you have a large number (think thousands) of extremely short pages (50-200 words of content), or lots of duplicated content where nothing changes but the page’s title tag and a line of text, that could get you in trouble. Look at the entirety of your site: are a large percentage of your pages thin, duplicated, and low value? If so, try to identify a way to “thicken” those pages, or check your analytics to see how much traffic they get and simply exclude them from search results (using a noindex meta tag), so that it doesn’t look to Google as if you’re trying to flood its index with lots of low-value pages in an attempt to get them to rank.
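A minimal sketch of how you might surface thin-page candidates from a crawl export, assuming a hypothetical CSV with "url" and "word_count" columns (the 200-word threshold is an illustrative cut-off, not a Google rule):

```python
# Flag pages below a word-count threshold as candidates to thicken,
# consolidate, or exclude with a noindex meta tag.
import csv

THIN_THRESHOLD = 200  # illustrative cut-off

with open("crawl_export.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

thin_pages = [r["url"] for r in rows if int(r["word_count"]) < THIN_THRESHOLD]

print(f"{len(thin_pages)} of {len(rows)} pages look thin")
for url in thin_pages[:20]:
    print(url)
```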
How you mark up your images can affect not only the way search engines perceive your page, but also how much image search traffic your site generates. An alt attribute is an HTML attribute that lets you provide alternative information for an image when a user can’t view it. Your images may break over time (files get deleted, users have difficulty connecting to your site, etc.), so having a useful description of the image is helpful from an overall usability perspective. It also gives you another opportunity – outside of your content – to help search engines understand what your page is about.
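As a quick sketch, a script like the following can flag images that are missing alt text on a page (assumes the requests and beautifulsoup4 packages; the URL is a placeholder):

```python
# List <img> tags on a page that have no alt text.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/t-shirts/graphic/retro-robot"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

for img in soup.find_all("img"):
    alt = (img.get("alt") or "").strip()
    if not alt:
        print(f"Missing alt text: {img.get('src')}")
```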
The SEO starter guide describes much of what your SEO will do for you. Although you don't need to know this guide well yourself if you're hiring a professional to do the work for you, it is useful to be familiar with these techniques, so that you can be aware if an SEO wants to use a technique that is not recommended or, worse, strongly discouraged.
What this means is that if someone visits a website and is logged into their Google account, the site owner cannot see the search keywords they used to get there. This has resulted in a great deal of organic traffic being incorrectly marked as direct. The same thing happened to Apple iOS 6 users carrying out Google searches through the Safari browser, after the operating system’s privacy settings were changed, as Search Engine Land reports.
Great post Ross, but I have a question about scaling the work that goes into producing the KOB score: how do you recommend going about getting the Moz difficulty score – do you pull it manually and then VLOOKUP everything, or some other way? My current Moz membership allows 750 keyword difficulty searches a day, so this can be a limiting factor in this research. Would you agree?
Mobile traffic: In the Groupon experiment mentioned above, Groupon found that both browser and device affect web analytics’ ability to track organic traffic. While desktops using common browsers saw a smaller impact from the test (10-20 percent), mobile devices saw a 50 percent drop in direct traffic when the site was de-indexed. In short, as the share of mobile users grows, we are likely to see even more organic search traffic misattributed as direct traffic.
“To give you an example, our domain authority is currently a mediocre 41 due to not putting a lot of emphasis on it in the past. For that reason, we want to (almost) automatically scratch off any keyword with a difficulty higher than 70%—we just can’t rank today. Even the 60% range as a starting point is gutsy, but it’s achievable if the content is good enough.”
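The filtering step described above is easy to automate once the difficulty scores have been pulled, however you obtain them. A minimal sketch, assuming a hypothetical CSV with "keyword" and "difficulty" columns and using the 70 ceiling from the quote purely as an example:

```python
# Drop keywords whose difficulty exceeds a chosen ceiling, then sort
# the remainder from easiest to hardest.
import csv

DIFFICULTY_CEILING = 70

with open("keywords_with_difficulty.csv", newline="", encoding="utf-8") as f:
    keywords = list(csv.DictReader(f))

viable = [k for k in keywords if float(k["difficulty"]) <= DIFFICULTY_CEILING]
viable.sort(key=lambda k: float(k["difficulty"]))

for k in viable:
    print(f'{k["keyword"]}: {k["difficulty"]}')
```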
To find the right people I downloaded a list of some of the most popular users within the community. To do this, I used Screaming Frog SEO Spider to gather a list of all the URLs on the website. I then exported this list into an Excel spreadsheet and filtered the URLs to only show those that were user profile pages. I could do this because all of the profile pages had /user/ within the URL.
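The same filtering step can be done in a few lines of Python instead of Excel. This is a sketch that assumes the crawler’s URL export is a CSV with an "Address" column (the filename is a placeholder):

```python
# Keep only URLs that contain /user/, i.e. the profile pages.
import csv

with open("internal_all.csv", newline="", encoding="utf-8") as f:
    urls = [row["Address"] for row in csv.DictReader(f)]

profile_pages = [u for u in urls if "/user/" in u]

print(f"{len(profile_pages)} profile pages found")
```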

Structured data is code that you can add to your site's pages to describe your content to search engines, so they can better understand what's on your pages. Search engines can use this understanding to display your content in useful (and eye-catching!) ways in search results. That, in turn, can help you attract just the right kind of customers for your business.
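For example, here is a small Python sketch that generates a JSON-LD structured data block for a product page. The schema.org Product type is real; the product details are placeholders, and the resulting script tag would be pasted into the page's HTML.

```python
# Build a schema.org Product block and wrap it in a JSON-LD script tag.
import json

product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Retro Robot Graphic T-Shirt",
    "description": "Soft cotton tee with a retro robot print.",
    "offers": {
        "@type": "Offer",
        "priceCurrency": "USD",
        "price": "24.99",
        "availability": "https://schema.org/InStock",
    },
}

print(f'<script type="application/ld+json">{json.dumps(product, indent=2)}</script>')
```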


Hi SEO 4 Attorneys, it could be anything. Is this for your own site or for a client's site? It could be an attempt at negative SEO from a competitor. The thing is, people may try to push hundreds of spammy links to a site in the hope of knocking it down. At the end of the day, my best advice is to monitor your link profile on a weekly basis. Try to remove negative links where possible; if you can't remove them, then opt for the disavow tool as a last resort.
In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[21] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
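To make the random-surfer idea concrete, here is a toy Python illustration of PageRank-style iteration over a tiny hand-made link graph. It is only a sketch of the concept, not Google's implementation; the damping factor of 0.85 is the commonly cited textbook value.

```python
# Simplified PageRank power iteration over a three-page link graph.
damping = 0.85
links = {            # page -> pages it links to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = rank[page] / len(outlinks)
        for target in outlinks:
            new_rank[target] += damping * share  # pass rank along each link
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

Pages that accumulate more (and stronger) inbound links end up with higher scores, which is the sense in which "some links are stronger than others."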
Earlier in the comment stream, there was a brief discussion about page load time/website speed and its effect on page ranking. I have tried to find unbiased information about which hosting company to use when starting a blog or a small WordPress site, keeping in mind the importance of speed. This endeavor has been harder than expected, as most hosting review sites have some kind of affiliate relationship with the hosting companies they review.
What I wonder is, is there any chance for a commercial website to win a featured snippet? Whenever I google something and see a featured snippet, it is always a non-commercial site. Is it just because most commercial sites lack the kind of information featured snippets draw on, or because Google doesn’t want to show commercial sites in featured snippets?
Not all web traffic is welcome. Some companies offer advertising schemes that, in return for increased web traffic (visitors), pay for screen space on the site. There is also "fake traffic": bot traffic generated by a third party. This type of traffic can damage a website's reputation, its visibility on Google, and its overall domain authority.[citation needed]

The term “organic traffic” refers to visitors who land on your website as a result of unpaid (“organic”) search results. Organic traffic is the opposite of paid traffic, which describes visits generated by paid ads. Visitors who count as organic find your website after using a search engine like Google or Bing, so they are not “referred” by any other website.