What I like to call organic traffic, this is also known as free traffic. So, basically, organic or free traffic is your promotion through social media. You can use Instagram, you can use blogs, you can use forums. But the main type of organic traffic that I prefer to use is called SEO, which is Search Engine Optimization. So, for those of you who are not familiar with SEO, this is basically, if you were to search something up on Google or, you know, on YouTube, and an article or a video pops up, that right there is Search Engine Optimization. So, basically, if you’re ranking highly in the search results, so, you know, on the first page of Google, then that is massive amounts of organic traffic, depending on your keyword.
Here’s the thing: web traffic is essentially a measure of how much attention and noise your website is making across the web. So if you can generate a lot of website traffic, it means you’re getting a lot of attention (hopefully the positive kind, of course). But if you’re not generating much traffic, what could you be doing wrong?
If a web page is not listed on the first pages of a search, the odds of someone finding it diminish greatly (especially if there is other competition on the first page). Very few people go past the first page, and the percentage who reach subsequent pages is substantially lower. Consequently, getting proper placement on search engines, a practice known as SEO, is as important as the website itself.
6. Measurement and analysis. You won’t get far in SEO unless you know how to measure your results, interpret those results, and use your analysis to make meaningful changes to your approach. The best tool for the job is still Google Analytics, especially if you’re new to the game. Spend some time experimenting with different metrics and reports, and read up on Analytics knowledge base articles. There’s a deep world to dive into.

By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. In June 2007, The New York Times' Saul Hansell stated Google ranks sites using more than 200 different signals.[25] The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Some SEO practitioners have studied different approaches to search engine optimization, and have shared their personal opinions.[26] Patents related to search engines can provide information to better understand search engines.[27] In 2005, Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for logged-in users.[28]
Your site’s URL structure can be important both from a tracking perspective (a segmented, logical URL structure makes it easier to slice data in reports) and from a shareability standpoint (shorter, descriptive URLs are easier to copy and paste and are less likely to be mistakenly cut off). Again: don’t work to cram in as many keywords as possible; create a short, descriptive URL.
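As a rough illustration of the "short, descriptive URL" advice, here is a minimal slug generator in Python. The `slugify` helper and its five-word cap are hypothetical choices for the sketch, not a standard:

```python
import re

def slugify(title, max_words=5):
    """Build a short, descriptive URL slug from a page title.

    Lowercases, strips punctuation, and keeps only the first few
    words so the URL stays easy to copy, paste, and share.
    """
    words = re.sub(r"[^a-z0-9\s-]", "", title.lower()).split()
    return "-".join(words[:max_words])

print(slugify("10 Proven Ways to Improve Your Search Engine Rankings"))
# -> 10-proven-ways-to-improve
```

The word cap keeps keyword stuffing out of the URL by construction: even a keyword-heavy title collapses to a short, readable slug.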

Since heading tags typically make the text they contain larger than the normal text on the page, they are a visual cue to users that this text is important and can help them understand something about the type of content underneath the heading. Using multiple heading sizes in order creates a hierarchical structure for your content, making it easier for users to navigate through your document.
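To make the hierarchy idea concrete, here is a small Python sketch (standard library only) that collects a page's h1–h6 tags in order and flags headings that skip a level. The `skipped_levels` helper is a hypothetical illustration, not a real SEO tool:

```python
from html.parser import HTMLParser

class HeadingOutline(HTMLParser):
    """Collect h1-h6 levels in document order to inspect the heading hierarchy."""
    def __init__(self):
        super().__init__()
        self.levels = []

    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self.levels.append(int(tag[1]))

def skipped_levels(html):
    """Return True if any heading jumps more than one level deeper (e.g. h1 -> h3)."""
    parser = HeadingOutline()
    parser.feed(html)
    return any(b - a > 1 for a, b in zip(parser.levels, parser.levels[1:]))

page = "<h1>Guide</h1><h2>Basics</h2><h3>Details</h3><h2>Advanced</h2>"
print(skipped_levels(page))  # False: each step goes at most one level deeper
```

Moving back up the hierarchy (h3 to h2, as above) is fine; the check only flags jumps that skip a level on the way down, since those break the outline readers and crawlers infer.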
Direct traffic is defined as visits with no referring website. When a visitor follows a link from one website to another, the site of origin is considered the referrer. These sites can be search engines, social media, blogs, or other websites that have links to other websites. Direct traffic categorizes visits that do not come from a referring URL.
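The bucketing described above can be sketched in a few lines of Python. The domain lists here are tiny, hypothetical stand-ins; real analytics tools ship far larger referrer databases:

```python
from urllib.parse import urlparse

# Hypothetical domain lists for illustration only.
SEARCH_ENGINES = {"google.com", "www.google.com", "bing.com", "www.bing.com"}
SOCIAL_SITES = {"facebook.com", "www.facebook.com", "twitter.com", "t.co"}

def classify_visit(referrer):
    """Bucket a visit by its referrer, mirroring how analytics tools report it.

    An empty referrer (typed URL, bookmark, some apps) is what gets
    reported as "direct" traffic.
    """
    if not referrer:
        return "direct"
    host = urlparse(referrer).netloc.lower()
    if host in SEARCH_ENGINES:
        return "organic search"
    if host in SOCIAL_SITES:
        return "social"
    return "referral"

print(classify_visit(""))                               # direct
print(classify_visit("https://www.google.com/search"))  # organic search
print(classify_visit("https://example.com/post"))       # referral
```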
Backlinks are essentially authoritative links. When another site links to yours, it is signaling that you have authority on a particular keyword or in a particular market, and telling its readers they can find more helpful information on your site. The more high-quality, authoritative links you have, the more credible Google considers you to be in that market. You build this authority by getting other website owners to link to your website; the search engine’s algorithm then gives your SEO a boost, and your site is likely to rank higher. Blog commenting is one simple way to get backlinks to your website. Step 1: Find a relevant, high-traffic blog in your niche. Step 2: Actually read the post so you know what it’s about. Step 3: Leave a comment relevant to the topic, then simply place your link in the comment.

Thanks Brian. I’ve had an “a-ha” moment thanks to you! Great advice. I knew that backlinks would improve the organic SEO rankings of our client-targeted landing pages, but I never knew it came from getting influencers to backlink blogs. I always just assumed it was great content that users wanted to share with others. It was driving me mad why people love my content but never share enough. Now I know!

Search engines use complex mathematical algorithms to interpret which websites a user seeks. Imagine a diagram in which each bubble represents a website and arrows represent links between them; programs sometimes called spiders examine which sites link to which others. Websites with more inbound links, or stronger links, are presumed to be more important and more likely to be what the user is searching for. In this example, since website B is the recipient of numerous inbound links, it ranks more highly in a web search. And the links “carry through”: website C, even though it has only one inbound link, has an inbound link from a highly popular site (B), while site E does not.
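The "links carry through" idea is the intuition behind PageRank. The real ranking algorithms are undisclosed and use far more signals, but a simplified, illustrative version fits in a few lines of Python; the link graph below is a hypothetical example shaped like the one described above:

```python
def link_scores(graph, damping=0.85, iterations=50):
    """Simplified PageRank: a link from a high-scoring page is worth more
    than a link from an obscure one, so authority 'carries through'."""
    pages = list(graph)
    n = len(pages)
    scores = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {}
        for p in pages:
            # Each page splits its score evenly among its outbound links.
            incoming = sum(scores[q] / len(graph[q]) for q in pages if p in graph[q])
            new[p] = (1 - damping) / n + damping * incoming
        scores = new
    return scores

# Hypothetical graph: B has many inbound links; C's only link comes from B.
graph = {
    "A": ["B"], "B": ["C"], "C": ["B"],
    "D": ["B"], "E": ["B", "D"], "F": ["B"],
}
scores = link_scores(graph)
print(max(scores, key=scores.get))  # B ranks highest
```

Note that C, with a single inbound link from the popular B, ends up scoring far above E, which matches the "carry through" behavior in the prose.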


Trust is another important bucket you need to be aware of when you are trying to get your site to rank in Google. Google doesn’t want to show just any website to its searchers; it wants to show the best websites, so it favors sites that are trustworthy. One thing Google has indicated it likes to do is penalize sites, stores, or companies that consistently have poor reviews, so if you have many poor reviews, in time Google is going to figure out not to show your site in its rankings, because Google doesn’t want to show those sites to its searchers. So prove to Google’s algorithm that you are trustworthy: get other highly authoritative websites to link to you. Get newspaper articles, industry links, and other trusted sites (partners, vendors, happy customers) to link to your website to show that you are highly credible and trustworthy.

For our client: we were lucky enough to remove most of the links from the prior agency through outreach, and we also went directly to many webmasters whose links we wanted removed. We did not use the Disavow tool, as it was not around when we completed this link cleanup, but as we all know, it has been said that if you are going to use the Disavow tool, use it with caution.
By relying so much on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[13] Since the success and popularity of a search engine is determined by its ability to produce the most relevant results to any given search, poor quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was created to bring together practitioners and researchers concerned with search engine optimization and related topics.[14]
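For reference, keyword density, the crude signal those early engines over-relied on, is just the share of words on a page that match the keyword. A minimal sketch:

```python
def keyword_density(text, keyword):
    """Fraction of words in `text` that equal `keyword` -- the simple,
    easily gamed metric that keyword stuffers exploited."""
    words = text.lower().split()
    return words.count(keyword.lower()) / len(words) if words else 0.0

print(keyword_density("cheap shoes cheap shoes buy cheap shoes now", "cheap"))
# -> 0.375
```

A density that high on a real page is exactly the kind of stuffing that pushed engines toward harder-to-manipulate signals like links.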
An 11th point for me would be to look at your social media properties and work out how you can use them to assist your SEO strategy. Running competitions via social channels to drive SEO benefit to your main site is great, as is re-doing your YouTube videos to support the main site, and working on your content-sharing strategy from these social sites back to the main site.
You have also mentioned Quuu for article sharing and driving traffic. I have been using Quuu for quite some time now and I don’t think they’re worth it. While the content does get shared a lot, there are hardly any clicks to the site. Even for the clicks that are there, average time on page is like 0.02 seconds, compared to more than 2 minutes for other sources of traffic on my website. I have heard a few guys having a similar experience with Quuu, so I thought I should let you know.

Basically, what I’m talking about here is finding websites that have mentioned your brand name but they haven’t actually linked to you. For example, someone may have mentioned my name in an article they wrote (“Matthew Barby did this…”) but they didn’t link to matthewbarby.com. By checking for websites like this you can find quick opportunities to get them to add a link.
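As a rough sketch of this check, the following Python function counts brand mentions that do not sit inside a link. `unlinked_mentions` is a hypothetical helper, and real tooling would use a proper HTML parser rather than regular expressions:

```python
import re

def unlinked_mentions(html, brand):
    """Count mentions of `brand` in the page text that are not the
    anchor text of an <a> link -- i.e. link-building opportunities.
    Rough regex-based sketch; real tooling would parse the HTML."""
    linked = len(re.findall(
        rf"<a\b[^>]*>[^<]*{re.escape(brand)}[^<]*</a>", html, re.I))
    total = len(re.findall(re.escape(brand), html, re.I))
    return total - linked

sample = ('Great post by Matthew Barby. More at '
          '<a href="https://matthewbarby.com">Matthew Barby</a>.')
print(unlinked_mentions(sample, "Matthew Barby"))  # 1 unlinked mention
```

In practice you would feed this pages found via a search for your brand name, then reach out to the sites where the count is above zero.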
Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
If you are serious about improving search traffic and are unfamiliar with SEO, we recommend reading this guide front-to-back. We've tried to make it as concise as possible and easy to understand. There's a printable PDF version for those who'd prefer, and dozens of linked-to resources on other sites and pages that are also worthy of your attention.
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[46]
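You can see how crawlers interpret these rules with Python's standard-library `urllib.robotparser`. The robots.txt content below is a hypothetical example blocking internal search results and a shopping cart, in line with Google's guidance above:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: keep internal search results and the cart
# out of the index, leave the rest of the site crawlable.
rules = """\
User-agent: *
Disallow: /search
Disallow: /cart
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/search?q=shoes"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))       # True
```

Note the caveat from the paragraph above still applies: robots.txt keeps compliant crawlers out, but a cached copy means a blocked page may occasionally be crawled anyway, and for hard exclusion from the index the robots meta tag is the stronger signal.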
Hey James, congrats on your success here. Just a question about removing crummy links: for my own website, there are hundreds of thousands of backlinks in Webmaster Tools pointing to my site. The site has no penalties or anything, and the traffic seems to be growing every week. Would you recommend hiring someone to go through the link profile anyway to remove crummy links that just occur naturally?
Great article, learned a lot from it! But I still don’t really get the share triggers and the right content. For instance, the influencers now care a lot about the new Koenigsegg Agera RS >> https://koenigsegg.com/blog/ (car). I thought about an article like “10 things you need to know about the Koenigsegg Agera RS”. The only problem is that I don’t know which keywords I should use and how I can put in share triggers.