" "

Google doesn't always include a whole paragraph of text in the Featured Snippet. If you add "Step 1," "Step 2," "Step 3," etc. to the start of each HTML heading within your content (for example, within your H2 tags), Google will sometimes just list out your headings within the Featured Snippet. I've started to see this happen more and more for keywords beginning with "how to".
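A hedged sketch of what that markup might look like (the heading text and steps are hypothetical, not taken from any particular page):

    <h2>Step 1: Research your keywords</h2>
    <p>...</p>
    <h2>Step 2: Outline and write the post</h2>
    <p>...</p>
    <h2>Step 3: Build links to the page</h2>
    <p>...</p>

With the steps spelled out in the headings themselves, Google can lift them straight into a list-style Featured Snippet.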
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
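To see why that last point matters, here is a hypothetical robots.txt (the paths are invented for illustration). Anyone who requests /robots.txt can read it, so the very act of listing a directory advertises its existence:

    User-agent: *
    Disallow: /private-reports/
    Disallow: /staging/

If material is genuinely sensitive, put it behind authentication rather than relying on robots.txt.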
Since heading tags typically make text contained in them larger than normal text on the page, this is a visual cue to users that this text is important and could help them understand something about the type of content underneath the heading text. Multiple heading sizes used in order create a hierarchical structure for your content, making it easier for users to navigate through your document.
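As a minimal sketch, a page might use headings like these (the wording is hypothetical); each level nests logically under the one above it:

    <h1>A Guide to Growing Organic Traffic</h1>
    <h2>Keyword research</h2>
    <h3>Finding long-tail keywords</h3>
    <h2>On-page optimization</h2>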
Good point. The thing with this client is that they wanted to mitigate the risk of removing a large number of links, so high-quality link building was moved in early, before keyword research. So it is on a case-by-case basis, but it's definitely a good point: for most new clients I work with who do not have pre-existing issues, you want to do keyword research very early in the process.
Another reason is that if you're using an image as a link, the alt text for that image will be treated similarly to the anchor text of a text link. However, we don't recommend using too many images for links in your site's navigation when text links could serve the same purpose. Lastly, optimizing your image filenames and alt text makes it easier for image search projects like Google Image Search to better understand your images.
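For example, a text link and an equivalent image link might be marked up like this (URL and filename are hypothetical); the alt text on the image plays roughly the same role as the anchor text on the text link:

    <a href="/red-t-shirts/">Red t-shirts</a>
    <a href="/red-t-shirts/"><img src="red-cotton-t-shirt.jpg" alt="Red cotton t-shirt"></a>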
Use the right anchor text. Using our previous example: if you wanted to internally link to the “how to make money” blog post, you can write a sentence in another blog, like “Once you have mastered [how to make money], you can enjoy as much luxury as you can dream.” In this case, the reader has a compelling case for clicking on the link because of both the anchor text (“how to make money”) and the context of the sentence. There is a clear benefit from clicking the link.
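In the page's HTML, that internal link might look like the hypothetical snippet below; the anchor text tells both readers and search engines what the destination page is about:

    Once you have mastered <a href="/how-to-make-money/">how to make money</a>, you can enjoy as much luxury as you can dream.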

I've been using SE Ranking for tracking my progress in getting to the first page of Google for Qeryz for my target keywords. It's done a phenomenal job of keeping itself accurate - which sets it apart from all other rank tracking tools I've used in the past. That alone is reason enough for me to use and stay with SE Ranking amongst other things. Sean Si from Qeryz.com
Having large groups of content that all revolve around the same topic will build more relevance around keywords that you're trying to rank for within these topics, and it makes it much easier for Google to associate your content with specific topics. Not only that, but it makes it much easier to interlink between your content, pushing more internal links through your website.
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters only needed to submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.

Amber Kemmis is the VP of Client Services at SmartBug Media. Having a psychology background in the marketing world has its perks, especially with inbound marketing. My past studies in human behavior and psychology have led me to strongly believe that traditional ad marketing only turns prospects away, and that advertising spend never puts the right message in front of the right person at the right time, resulting in wasted marketing effort and investment. I'm determined to help each and every one of our clients attract and retain new customers in a delightful and helpful way that leads to sustainable revenue growth. Read more articles by Amber Kemmis.

Search engines use complex mathematical algorithms to interpret which websites a user seeks. In this diagram, each bubble represents a website, and programs sometimes called spiders examine which sites link to which other sites, with arrows representing these links. Websites that receive more inbound links, or stronger links, are presumed to be more important and more likely to be what the user is searching for. In this example, since website B is the recipient of numerous inbound links, it ranks more highly in a web search. And the links "carry through": website C, even though it has only one inbound link, has that link from a highly popular site (B), while site E does not.
Wow Brian, you have solved my problem. A few days back I was looking for ways to increase traffic on my tech blog, and I found this blog post while looking for possible tricks to increase traffic. I must say that a few of the tricks mentioned above really worked for me. For example, I updated a few old posts on my blog, I tried the broken link building technique, and lastly I reposted my content on Medium.
How much does it cost to bring in a visitor? Some web traffic is free, but many online stores rely on paid traffic — such as PPC or affiliates — to support and grow their business. Cost of Acquiring Customers (CAC) and Cost Per Acquisition (CPA) are arguably the two most important ecommerce metrics. When balanced with AOV (average order value) and CLV (customer lifetime value), a business can assess and adjust its ad spend as necessary.
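As a purely hypothetical illustration of how these metrics interact: if a store spends $1,000 on ads in a month and acquires 50 customers, its CAC is $20. If those customers have an AOV of $60 and order about three times over their lifetime (a CLV of roughly $180), a $20 acquisition cost is comfortably sustainable; if CLV were only $25, the same ad spend would leave almost no margin and would need to be reworked.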
Kristine Schachinger has 17 years of digital experience, including a focus on website design and implementation, accessibility standards, and all aspects of website visibility involving SEO, social media, and strategic planning. She additionally specializes in site health auditing, site forensics, technical SEO, and site recovery planning, especially when Google algorithms such as Penguin and Panda are involved. Her seventeen years in design and development and eight years in online marketing give her a depth and breadth of understanding that comes from broad exposure not only to digital marketing, but to the complete product lifecycle along with the underlying technology and processes. She is a well-known speaker and author and can be found on LinkedIn, Google+ and Twitter.

That’s true Thomas – this can happen when going after very competitive keywords. To avoid that, you can just grab the first subpage you see ranking – subpages usually won’t have a lot of brand searches associated with them, so you’ll see the true topic value. It may be lower than normal, but in general it can’t hurt to have a rough calculation on hand when making the case for what you might achieve.
#6 Go on podcasts! In 13 years of SEO and digital marketing, I’ve never had as much bang for the buck. You go on for 20 minutes, get access to a new audience and great natural links on high dwell time sites (hosts do all the work!). Thanks for including this tip Brian, I still don’t think the SEO community has caught on to the benefits of podcast guesting campaigns for SEO and more…it’s changed my business for sure.
Trust is another important bucket that you need to be aware of when you are trying to get your site to rank in Google. Google doesn’t want to show just any website to its searchers; it wants to show the best websites, and so it favors sites that are trustworthy. One thing Google has indicated it likes to do is penalize sites, stores, or companies that consistently have poor reviews, so if you have many poor reviews, in time Google is going to figure out not to show your site in its rankings, because Google doesn’t want to show those sites to its searchers. So prove to Google’s algorithm that you are trustworthy. Get other highly authoritative websites to link to you. Get newspaper articles, industry links, and links from other trusted sites: partners, vendors, happy customers. Get them to link to your website to show that you are highly credible and trustworthy.
8. Technical SEO. Technical SEO is one of the most intimidating portions of the SEO knowledge base, but it’s an essential one. Don’t let the name scare you; the most technical elements of SEO can be learned even if you don’t have any programming or website development experience. For example, you can easily learn how to update and replace your site’s robots.txt file, and with the help of an online template, you should be able to put together your sitemap efficiently.
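For instance, a minimal XML sitemap that follows the sitemaps.org protocol can be as short as the sketch below (the URL and date are hypothetical):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/how-to-make-money/</loc>
        <lastmod>2019-01-15</lastmod>
      </url>
    </urlset>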
I’ve always been a believer that hard work gets the best results, and in practice it always ends up being true. On the web it’s no different. If you want more organic traffic, you have to work for it. That means giving your best effort every time, going after opportunities your competitors have missed, being consistent, guest blogging strategically, and staying on Google’s good side.

Nice post. I was wondering whether all the content in your strategy was written on the site's blog, or whether you added content in some other specific parts of the site. I don't believe 100% in the strategy of removing links. If Google just penalized you based on your inbound links, it would be easy to attack your competitors simply by buying dirty link packages targeting their sites.


As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable example markets are China, Japan, South Korea, Russia and the Czech Republic where respectively Baidu, Yahoo! Japan, Naver, Yandex and Seznam are market leaders.

Now, take a look at your content. When writing good content, make sure that you do so in short paragraphs, first of all. (Attention span is everything, after all.) Also be sure to include lots of real-life examples, as these will allow your readers to relate with you more. More importantly, don’t forget to back your data with research, links, and other supporting details. Even if you’re just talking about website traffic, your readers will appreciate that you’ve read a lot about it, too.
When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam. Nofollowing these user-added links ensures that you're not giving your page's hard-earned reputation to a spammy site.
I’m not sure that’s natural link building/earning and I feel Google would have a problem with webmasters getting hundreds of links from different entities which were all identical? The websites embedding the images may not even know they are linking to you. Google in the past recommended these kinds of links are nofollow: https://searchengineland.com/googles-matt-cutts-i-recommend-nofollowing-links-on-widgets-169487
Awesome tips Brian. Always enjoy your posts. My question is, how can I boost traffic significantly if my keyword has pretty low search volume (around 100 monthly searches based on keyword planner)? I’ve been trying to expand my keyword list to include broader terms like “customer experience” but as you know that is super competitive. Do you have any suggestions for me? Thanks in advance.
You can confer some of your site's reputation to another site when your site links to it. Sometimes users can take advantage of this by adding links to their own site in your comment sections or message boards. Or sometimes you might mention a site in a negative way and don't want to confer any of your reputation upon it. For example, imagine that you're writing a blog post on the topic of comment spamming and you want to call out a site that recently comment spammed your blog. You want to warn others of the site, so you include the link to it in your content; however, you certainly don't want to give the site some of your reputation from your link. This would be a good time to use nofollow.
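Marking the link is just a matter of adding a rel attribute; a hypothetical example (the URL is invented):

    <a href="https://www.example.com/spammy-page/" rel="nofollow">this site</a>

The link still works for your readers, but it tells search engines you are not vouching for the destination.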

Search engines may penalize sites they discover using black hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms, or by a manual site review. One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices.[53] Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's search engine results page.[54]
1- What if my site is new and I have just started publishing blog posts – how can I reach the first page? Do you think that following the steps mentioned in the article would be enough? How do I move my blog from zero to the first page? In a nutshell, in a world where the competition for some keywords is really tough and gaining backlinks is a real challenge, how do I improve my authority from zero?

He started by finding an offer that resonated with and is relevant to his audience. In his case, his blog was dedicated to teaching people how to use a software called “Sublime Text.” He simply offered a license to the software for the giveaway. By doing this, not only did he increase the chances of success of his giveaway since his incentive was relevant, but he also ensured the quality of subscribers since they were actually people interested in his content. It’s easy to give people an iPad or an iPhone, but how relevant will they be to you at the end of the day?


The response rate here was huge because this is a mutually beneficial relationship. The bloggers get free products to use within their outfits (as well as more clothes for their wardrobe!) and I was able to drive traffic through to my site, get high-quality backlinks, a load of social media engagement and some high-end photography to use within my own content and on product pages.

My company has been working on a large link building project. We’ve already performed extensive keyword research and link analysis and now we’re considering executing an email outreach campaign. However, all the content we’ve created up until this point is geared more towards our target audience as opposed to the key influencers of our target audience. Do you think it would be worth it to try to build backlinks to our existing content or are we better off creating new content that directly appeals to the influencers of our target audience?

To find the right people I downloaded a list of some of the most popular users within the community. To do this, I used Screaming Frog SEO Spider to gather a list of all the URLs on the website. I then exported this list into an Excel spreadsheet and filtered the URLs to only show those that were user profile pages. I could do this because all of the profile pages had /user/ within the URL.
Go to local events or Meetup events and connect with bloggers in your industry. An example of an event I run to connect with bloggers and people in the online marketing world is: http://www.meetup.com/Online-Marketing-Sydney/. Make friends first and then try to gain guest posts later. I am not really a fan of websites which are flooded with guest posts one after another; that is the type of thing Google is just waiting to target.
Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags and better snippets for your users. We also have a handy Help Center article on how to create good titles and snippets.
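The tag itself lives in the page's <head>; a hedged sketch (the title and wording are hypothetical):

    <head>
      <title>How to Make Money Blogging</title>
      <meta name="description" content="A step-by-step guide to starting a blog, choosing a niche, and earning your first income online.">
    </head>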
I am very late commenting on this article. I wanted to read about how much organic traffic you can get from SEO, found your article at the top, and it was really interesting. James Norquay, you did good research. I think these days Google blocks most SEO activities. Is this still worthwhile in the current marketing scenario? If you have any other post on strategies for increasing organic traffic, please refer me to it.
Buy Organic Traffic or Organic Search Traffic and receive real human visitors to your website. Our organic traffic comes from the main search engines such as Google, Bing, and Yahoo. You are entitled to target three keywords per campaign, which must be relevant to your website and match your on-page SEO. If you only wish to receive traffic without keyword targeting, then please order Direct Traffic.

For our client: we were lucky enough to remove most of the links left over from the prior agency's outreach, and we also went directly to many webmasters from whom we wanted links removed. We did not use the Disavow Tool, as it was not around when we completed this link cleanup, but as has often been said, if you are going to use the Disavow Tool, use it with caution.
SEO.com will work with you now and in the future to provide all the online marketing services you may need to keep growing your business competitively. Since we offer a complete, compatible array of web-related services, you won’t need to hire, herd, or manage random outside or foreign firms, or take the many risks of mixing them into your projects.
Thank you Brian for this great article! I enjoyed reading it, even though it took quite some time of slow reading to let all the concepts sink in and try to remember them. For future reference, I also shared your article in a Facebook post so I can refer back to it and share it with those who work with me. I like the way you presented the details; it's easy to read and understand. :)
To track website traffic you need analytics software that anonymously logs every visitor to your site and keeps track of their actions. One of the most popular analytics apps is Google Analytics. It’s free, but there can be a learning curve in getting it set up correctly. If you don’t add your tracking code the right way, you won’t be able to track your traffic. Consider getting an analytics expert to help you create the right setup for your website.
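For reference, Google's documented gtag.js snippet goes in the <head> of every page you want tracked; GA_MEASUREMENT_ID is a placeholder you replace with your own property's ID:

    <script async src="https://www.googletagmanager.com/gtag/js?id=GA_MEASUREMENT_ID"></script>
    <script>
      window.dataLayer = window.dataLayer || [];
      function gtag(){dataLayer.push(arguments);}
      gtag('js', new Date());
      gtag('config', 'GA_MEASUREMENT_ID');
    </script>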
When Googlebot crawls a page, it should see the page the same way an average user does. For optimal rendering and indexing, always allow Googlebot access to the JavaScript, CSS, and image files used by your website. If your site's robots.txt file disallows crawling of these assets, it directly harms how well our algorithms render and index your content. This can result in suboptimal rankings.
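Concretely, rules like the hypothetical ones below are the kind to look for, because they stop Googlebot from fetching the assets it needs to render the page; removing them (or explicitly allowing those paths) lets Googlebot see the page the way a visitor does:

    User-agent: *
    Disallow: /css/
    Disallow: /js/
    Disallow: /images/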
Now, some buckets are worth more than others, and the three main buckets that you need to be aware of for search rankings are quality, trust, and authority. So quality: what Google is trying to measure when it's trying to figure out which sites should rank is whether you are offering something valuable, unique, or interesting to Google's searchers. For example: good content. If you are selling t-shirts and you are using the same description that every other t-shirt seller is using on their website, then you are not offering anything unique to Google's searchers. Even though your t-shirts might look pretty cool, the content is the same as everybody else's, so Google has no way of telling that your t-shirts or your t-shirt site is better than anybody else's. Instead, offer people interesting content. For example: offer them the ability to personalize their t-shirt. Give them information on how to wash it. What's the thread count? Is it stain resistant? Is this something you should wear in the summer, or is it heavier, for winter? Give people information, or even be more creative. Get people to share pictures of themselves wearing the t-shirt. Create a community of people who are interested in your product. Get a famous person to wear it and share that picture online. Do something different, do something unique. Show Google that you are different and better than the other search results.
Good question. For most directories I use, they ask for a mobile number to send a verification message; for the ones which phone you for verification, inform the company beforehand so their customer service people are ready to take the call. I know the bigger the company, the trickier these things get; you just have to find out what works best for answering the calls, even if they give you a direct number to use.

Search engines are smart, but they still need help. The major engines are always working to improve their technology to crawl the web more deeply and return better results to users. However, there is a limit to how search engines can operate. Whereas the right SEO can net you thousands of visitors and increased attention, the wrong moves can hide or bury your site deep in the search results where visibility is minimal.
1. The big picture. Before you get started with individual tricks and tactics, take a step back and learn about the “big picture” of SEO. The goal of SEO is to optimize your site so that it ranks higher in searches relevant to your industry; there are many ways to do this, but almost everything boils down to improving your relevance and authority. Your relevance is a measure of how appropriate your content is for an incoming query (and can be tweaked with keyword selection and content creation), and your authority is a measure of how trustworthy Google views your site to be (which can be improved with inbound links, brand mentions, high-quality content, and solid UI metrics).
Users will occasionally come to a page that doesn't exist on your site, either by following a broken link or typing in the wrong URL. Having a custom 404 page that kindly guides users back to a working page on your site can greatly improve a user's experience. Your 404 page should probably have a link back to your root page and could also provide links to popular or related content on your site. You can use Google Search Console to find the sources of URLs causing "not found" errors.
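A custom 404 page doesn't need to be elaborate; a minimal, hypothetical sketch might be:

    <h1>Page not found</h1>
    <p>The page you were looking for doesn't exist or may have moved.</p>
    <p><a href="/">Return to the homepage</a> or browse our <a href="/blog/">most popular articles</a>.</p>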

Hi Matt, I'm realizing now how difficult it is to run a blog, promote it, and carry on with your daily activities. I would say it's a full-time job. Once you think you're done learning about something, something else comes along :). My blog is about preparing for an Ironman, so I need to add the training on top of it. Thanks a lot for sharing this article with us so we can keep focused!