14 "No-BS" Ways to Increase Organic SEO Traffic (with Case Studies)

#1: Eliminate "organic anchors" with a data-driven content audit

Lots of low-quality pages = bad news for SEO. Why? Because they weigh down the rest of your website, causing (better) pages to underperform in the SERPs.

The solution? Pruning.

In simple terms, pruning involves auditing and removing "dead weight" content from your site. I.e. any pages that have ZERO links, ZERO traffic and ZERO conversions (and/or contain irrelevant/thin content) are prime candidates to be deleted. These types of pages offer nothing of value to your site, and are actually weighing down other important assets by eating up precious crawl budget (meaning new or updated content gets crawled less often).

Note: There are some outliers in the pruning process. For example, if you have an important resource on your site that gets little traffic or inbound links, but does get a lot of internal links, you might still want to keep it.

Here's an example: At the start of 2017, my agency started working with one of the nation's leading defamation attorneys. When he came to us he was getting around 3,500 organic visits and 130 new leads a month from organic traffic. Not too bad. But he wanted to do better. So the first thing we did was follow the content audit process outlined below. At a high level:

- He had two different websites competing for the same keywords, so we consolidated the two sites and merged all competing assets (more on this in tactic #2)
- He had dozens of the "dead weight" pages we discussed above, so we removed those pages from the site
- He had a bunch of pages on the site already ranking on page 2 for valuable search terms, so we improved those assets (more on this in tactic #3)

Here are the results 10 months later: he's now getting over 20,000 organic visits and 400 leads a month. In fact, business is so good he has now started his own firm.

Note: These results were achieved writing very little new content, and building only a handful of new backlinks.
Here's an overview of my decision-making process during a content audit. During the audit phase, we'll make one of four page-level strategic recommendations:

- KEEP content that is relevant and getting a lot of traffic and conversions.
- IMPROVE content with the potential to either get more traffic (tactic #3), or more conversions from the existing traffic.
- MERGE content with backlinks that is competing for the same keywords as another, higher-ranking piece of content on the site.
- REMOVE content with no links, traffic or conversions.

Now, for all the visual learners out there, think of this process in terms of an iceberg analogy (hat tip to Everett Sizemore over at GoInflow). Pages appearing above the water line are top performers (keep these!), whereas those just below the water line have potential, but need some improvements (updated content, re-promotion, conversion optimization, etc.) to reach their full potential. Any pages deep down at the bottom of the iceberg are the ones you need to get rid of: they're generally low-quality assets providing no added value, and weighing down the rest of your website.

OK, so how do you identify which pages to keep, improve, merge or remove? Follow this workflow.

Note: there are a few different ways you can pull the data for this audit. I'm going to focus on one that allows you to scale the process relatively quickly, and avoids having to use a lot of paid tools. You can grab a copy of the content audit template used in the example below.

First, you need to check whether or not the pages/posts on your website have any inbound links.
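The four-way decision above is easy to encode. Here's a minimal Python sketch of that logic; the field names, thresholds and the `cannibalizes` flag are my own illustrative assumptions, not the exact columns from the author's audit template:

```python
# A sketch of the KEEP / IMPROVE / MERGE / REMOVE decision rules.
# Field names and thresholds are illustrative assumptions.

def audit_recommendation(page):
    """Return one of KEEP / IMPROVE / MERGE / REMOVE for a page dict."""
    has_links = page["referring_domains"] > 0
    has_traffic = page["organic_sessions"] > 0
    has_conversions = page["conversions"] > 0

    # No links, traffic, or conversions -> prime candidate for removal.
    if not (has_links or has_traffic or has_conversions):
        return "REMOVE"
    # Has backlinks but competes with a stronger page for the same
    # keyword -> merge it into the higher-ranking asset.
    if has_links and page.get("cannibalizes"):
        return "MERGE"
    # Relevant, with traffic and conversions -> leave it alone.
    if has_traffic and has_conversions:
        return "KEEP"
    # Everything else has potential but needs work.
    return "IMPROVE"

pages = [
    {"url": "/dead-page", "referring_domains": 0, "organic_sessions": 0, "conversions": 0},
    {"url": "/winner", "referring_domains": 12, "organic_sessions": 900, "conversions": 30},
    {"url": "/duplicate", "referring_domains": 5, "organic_sessions": 40, "conversions": 0, "cannibalizes": True},
    {"url": "/page-two", "referring_domains": 2, "organic_sessions": 50, "conversions": 0},
]

for p in pages:
    print(p["url"], audit_recommendation(p))
```

As the note above says, a rule like this can't judge relevancy, so treat the output as a first pass to review manually.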
You can use URL Profiler and Ahrefs to quickly scale this part of the process. Here's how:

Copy the URL of your sitemap (hint: this is usually found at yourdomain.com/sitemap.xml or yourdomain.com/post-sitemap.xml). In URL Profiler, right-click on the URL list area and hit "Import from XML sitemap". You'll then be prompted to paste your sitemap URL in the box. Hit "Import" and URL Profiler will automatically pull in every URL it finds in the sitemap.

Note: If you have more than one sitemap (e.g. pages and posts sitemaps), you'll need to repeat this process to pull every page/post into URL Profiler.

OR: use a tool like Screaming Frog to extract all the indexed content on your site with a single crawl. The free version of Screaming Frog will allow you to crawl 500 URLs, but you will have limited configuration options. Here are the basic settings I use when collecting the indexed content from a website, along with the "Advanced" settings. You'll be left with a list of the URLs that can be crawled and indexed by search spiders like Googlebot.

Regardless of which approach you take above (sitemap extraction or Screaming Frog crawl), the next step is to export all the URLs and paste them into URL Profiler. Next, connect your Ahrefs account to URL Profiler (instructions on doing so can be found here). Check the Ahrefs box (under URL level data), then hit "Run profiler".
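If you'd rather script the sitemap-extraction step yourself, the XML sitemap format is simple to parse with Python's standard library alone. A sketch, using an inline sample sitemap rather than a real site:

```python
# Extract every <loc> URL from an XML sitemap. The sample sitemap
# below is made up; in practice you would fetch yours over HTTP.
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def urls_from_sitemap(xml_text):
    """Return every <loc> URL found in a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/post-1</loc></url>
</urlset>"""

print(urls_from_sitemap(sample))
```

If your site splits pages and posts across multiple sitemaps, run this once per sitemap and concatenate the results, just like the repeat-import step described above.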
Within a few minutes (depending on the number of URLs), URL Profiler will spit out a spreadsheet with a lot of data, including the number of referring domains pointing to each page/post on your website. Copy all the data from this spreadsheet and paste it into the sheet labelled "URL Profiler" in this Google Sheet.

OK, so now you know how many inbound links (if any) are pointing to each page on your website. The next step is to check which of these pages actually have traffic/conversions in Google Analytics. Go to Google Analytics > Customisation > Custom Reports > New Custom Report. Set up your custom report so it matches the screenshot below (note: I've highlighted the super-important parts!), then hit "Save" and view the report.

Note: I recommend setting the date range for the report to the last 3 months.

Export the report, making sure to set the number of visible rows to the maximum amount (5,000) first. Copy/paste all data from the exported .csv into the sheet labelled "2. GA Export" in the Google Sheet.

Finally, navigate to the "DONE" sheet. This compiles all the data and gives a recommended action (e.g. "keep", "consolidate", etc.).

Note: This recommended action doesn't take into account the relevancy of the page, so you will need to double-check that manually before making a final decision.

BONUS: This is a modified version of my content audit process. If you want access to the exact processes, templates and tools we use at our agency, I'm including a "playbook" in my new course. You can find out more about it here.
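Under the hood, the "DONE" sheet is essentially a join of the two exports on URL: link data on one side, analytics data on the other, with missing GA rows treated as zero traffic. A rough Python equivalent, with assumed column names and sample rows:

```python
# Join the link export with the analytics export on URL.
# Column names and sample rows are illustrative assumptions.

link_data = {  # from the URL Profiler / Ahrefs export
    "/guide": {"referring_domains": 8},
    "/old-post": {"referring_domains": 0},
}
ga_data = {  # from the Google Analytics custom report
    "/guide": {"sessions": 1200, "goal_completions": 40},
    # "/old-post" never appears in GA, i.e. zero organic traffic
}

merged = []
for url, links in link_data.items():
    # Pages absent from the GA export get zeroed-out metrics.
    ga = ga_data.get(url, {"sessions": 0, "goal_completions": 0})
    merged.append({"url": url, **links, **ga})

for row in merged:
    print(row)
```

From a merged table like this, a page such as /old-post (no links, no sessions, no conversions) surfaces immediately as a pruning candidate.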
#2: Prevent Your Website from Competing with Itself by Identifying (and Removing) Keyword Cannibalization

"Keyword cannibalization" occurs when two or more pages on your website are competing for the same keyword. Here's why this is such a BIG problem:

- Google will struggle to figure out which one of your pages actually deserves to rank, so they'll often choose to rank neither of them.
- Links/shares/etc. will be split between two or more pages, leading to less authority for each page (this is bad, as pages with higher authority tend to rank better).

To put it simply, because your website is effectively competing with itself, you're significantly diluting your chances of ranking at all! Keyword cannibalization should, therefore, be avoided at all costs. This process captures the "MERGE" aspect of the content audit covered above in greater detail.

Here's how you can identify (and fix) keyword cannibalization issues in 3 simple steps:

1. Use SEMrush to see which keywords your website is ranking for
2. Look for keyword duplication (i.e. multiple pages ranking for the same keyword)
3. Solve the issue by either merging the two (or more) resources together, OR deleting/404ing one of them (note: only do this if there are ZERO links/traffic to that page!)

Example: One of my clients had an article targeting the search term "marketing technology stack" that suddenly fell from position #4 in Google to page #4. At first, the client thought it might have been some kind of page-level algorithmic penalty. After running the process outlined below, we found 5 different articles competing for the same keyword. Each competing article had links pointing to it. So, instead of spreading the link equity across 5 different pages, we took all the unique content from the lower-ranking pages and merged it into the canonical (highest-ranking) version, then 301 redirected all the other posts to it to consolidate the link equity.
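The duplication check in step 2 boils down to grouping a keyword export by keyword and flagging any keyword with more than one ranking URL. A sketch in plain Python; the rows are illustrative, loosely modeled on the "marketing technology stack" example:

```python
# Flag keywords that two or more URLs on the same site rank for.
# Rows mimic a SEMrush positions export; the data is made up.
from collections import defaultdict

rows = [
    {"keyword": "marketing technology stack", "url": "/mtech-stack", "position": 4},
    {"keyword": "marketing technology stack", "url": "/martech-tools", "position": 38},
    {"keyword": "seo audit", "url": "/seo-audit", "position": 7},
]

by_keyword = defaultdict(list)
for row in rows:
    by_keyword[row["keyword"]].append(row)

# Any keyword with more than one ranking URL is cannibalized.
cannibalized = {kw: hits for kw, hits in by_keyword.items() if len(hits) > 1}

for kw, hits in sorted(cannibalized.items()):
    urls = sorted(h["url"] for h in hits)
    print(f"{kw}: {len(hits)} competing pages -> {urls}")
```

The flagged keywords are then candidates for the merge/redirect or delete decisions described in step 3.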
The page became a much more in-depth, authoritative resource on the subject, and got added authority from the links that were 301'd from the other articles. The result: the page has gone from ~200 organic pageviews to almost 1,000/mo, and it now ranks #1 for its target search term. This is without writing any new content or building any new links. Now, imagine what happens when you scale this process across websites with lots of competing articles :)

Let's walk through the process. To begin, enter your domain (e.g. robbierichards.com) into the Organic Keywords report in SEMrush, then select "Positions" from the sidebar. This will show you every keyword your website is ranking for. It also tells you which page ranks for each keyword, and the position in which it ranks. Export this entire report to a .csv.

Next, copy/paste all the exported data into the sheet named "1. SEMRush Export" in this Google Sheet. Finally, navigate to the "DONE" tab and it will show you all the keyword cannibalization issues on your website. Cool, right!? :D

Here are a couple of ways to solve these issues:

- If the two pages competing for the same keyword are very similar, and both offer unique value, consider merging them into one canonical resource. Just make sure to 301 redirect one of the pages to the new canonical resource (especially if it has links pointing to it!)
- If the competing page offers nothing of unique value, delete it. If the deleted page has links pointing to it (check this in Ahrefs), add a 301 redirect to the competing resource; otherwise, just let it 404.

#3: Uncover Low-hanging Ranking Opportunities by Performing Keyword Research for an Existing Website

Keyword research only needs to be done when you're starting a new website, right!? NOPE! This couldn't be more wrong. Improving rankings for keywords you're already ranking for is the quickest and easiest way to get a TON more traffic to your website.
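In practice, "keywords you're already ranking for" means filtering a rank-tracking export down to terms sitting low on page 1 or on page 2, with meaningful search volume. A sketch of that filter; the rows and thresholds are made up, so tune them to your own industry:

```python
# Filter a keyword export down to "low-hanging" opportunities:
# decent volume, ranking just off the top spots. Sample data only.

keywords = [
    {"keyword": "increase organic traffic", "position": 8, "volume": 1300},
    {"keyword": "seo tips", "position": 45, "volume": 5400},
    {"keyword": "what is seo", "position": 2, "volume": 9900},
    {"keyword": "content audit template", "position": 14, "volume": 480},
]

MIN_VOLUME = 300          # lower this to surface more opportunities
POSITION_RANGE = (4, 20)  # low on page 1 through page 2

low_hanging = [
    kw for kw in keywords
    if kw["volume"] >= MIN_VOLUME
    and POSITION_RANGE[0] <= kw["position"] <= POSITION_RANGE[1]
]

# Easiest wins first: closest to the top, then highest volume.
low_hanging.sort(key=lambda kw: (kw["position"], -kw["volume"]))

for kw in low_hanging:
    print(kw["keyword"], kw["position"], kw["volume"])
```

Note that a term already at position #2 ("what is seo" here) is excluded: there is little headroom left, so the effort is better spent on page-2 stragglers.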
Want proof? I increased organic traffic to this post by 402% within 30 days of implementing this strategy. It went from position #8 to #2 overnight, which is why the traffic shot up like a rocket! And this was after optimizing ONE page... if you were to do this across your entire website, traffic would go through the roof.

Here's how to do it:

1. Identify "low-hanging" keyword opportunities (i.e. those that you're already ranking for on page 2 of the SERPs, OR low down on page 1)
2. Optimize the pages and relaunch for MASSIVE traffic boosts

Here's the process: go to SEMrush, enter your website, then go to the Positions report. This will show you EVERY keyword you're ranking for, along with the ranking position. BUT, we're not interested in every keyword; we want to focus on the ones with the most potential. To do that, apply these filters to the report.

Note: Set the search volume threshold to something that makes sense for your industry. You may need to lower it a bit to find more opportunities.

Export the results to a .csv, then copy/paste the data into the sheet labelled "1. SEMRush Export" in this Google Sheet. Now, go to the next tab labelled "DONE". All of these keywords are low-hanging opportunities, but the rows that are the most green are the opportunities likely to yield the BEST results with the LEAST amount of effort.

After you've found keywords that (1) have search volume, (2) have existing rankings, and (3) can realistically be ranked for in the next 60-90 days, you need to prioritize. When I do this final part of the process, I always rely on a bottom-up view of the funnel (i.e. start with the "money" keywords at the bottom of the funnel, and work my way back up to the top).

Here is a quick overview of how I would optimize these posts to move up the rankings:

- Update existing tactics with new screenshots and additional information
- Add 3-5 new strategies to the post
- Re-promote the post across social media
- Run a paid social media campaign to build social signals
- Launch a light outreach campaign to capture additional backlinks
- Add internal links from several other related posts on the site

To find the best internal linking targets, navigate to the "Best By Links" report in Ahrefs and filter by either URL Rating or Referring Domains. This will surface the most authoritative pages on your site. For example: since this post is about increasing organic traffic, you bet I'm going to add a few internal links from this post and this post.

Note: I have a full post dedicated to this strategy here. But if you want to go deeper into the specific keyword research and relaunch tactics I use for clients, check out my SEO training course here.

#4: Perform Keyword Research at the Subfolder Level (and Find Your Highest Value Targets)

Not all keywords are created equal. A site that monetizes through AdSense revenue will prioritize high-volume informational intent keywords to drive more ad impressions and clicks. Check out the search volume and traffic numbers for this article. An affiliate website like Wirecutter will prioritize investigational intent keywords, searched when people are evaluating different solutions for a specific problem or need.
Think: "best tool for x", "product x vs product y", or "product x alternatives". Ecommerce stores will prioritize transactional terms, since these have the highest degree of direct buying intent.

Because different business models prioritize different search intent, it makes sense to mirror this during the keyword research process to ensure you are only focusing on the search terms with the greatest potential bottom-line impact. This applies to both new and existing websites. One of the easiest ways to perform this type of laser-targeted research (for both new and existing sites) is to analyze keywords at the subfolder level.

Important: this strategy works when your site has a clean URL structure, with content types organized into dedicated subfolders.

Here's how to do it:

1) Find your highest value existing keyword opportunities

In the previous step we looked at how to find ALL the quick-win keyword opportunities for your site, across ALL intent buckets. This approach works great if you only publish content with a single form of intent; for example, a blog that only publishes high-volume informational content to monetize through AdSense revenue. But if you have an eCommerce site, you're going to be targeting informational, investigational AND transactional intent keywords across assets like blog content, comparison pages and top-level product/category pages. While each level of intent serves a specific purpose for the business, it's the investigational and transactional keywords that drive direct bottom-line value. Therefore, it makes sense to prioritize the keyword research process around commercial intent terms. And this is where subfolders come into play.
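Once you have a keyword export that includes each ranking URL, restricting it to a commercial subfolder is a simple path check. A sketch; the /collections path and the rows are illustrative, loosely modeled on the Beardbrand example below:

```python
# Restrict an exported keyword list to one URL subfolder.
# The /collections path and sample rows are illustrative.

rows = [
    {"keyword": "beard oil", "url": "https://example.com/collections/beard-oil", "position": 12},
    {"keyword": "how to grow a beard", "url": "https://example.com/blog/grow-a-beard", "position": 9},
    {"keyword": "beard balm", "url": "https://example.com/collections/beard-balm", "position": 15},
]

def in_subfolder(url, subfolder):
    """True if the URL's path sits under the given top-level subfolder."""
    return f"/{subfolder}/" in url

# Keep only keywords ranking via the commercial /collections subfolder.
commercial = [r for r in rows if in_subfolder(r["url"], "collections")]

for r in commercial:
    print(r["keyword"], r["position"])
```

Swapping "collections" for "blog" would return the informational bucket instead, mirroring the second pass described below.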
For example, most eCommerce sites are going to house products under some type of top-level URL subfolder. So, instead of starting the keyword research process by looking at ALL the keyword opportunities (like we did in the previous strategy), start with the keyword opportunities inside the commercial intent subfolder(s).

Here's how to do it for an existing site: open SEMrush and enter your domain into the SEO Toolkit, then go to the Organic Research >> Positions report. You'll once again see all the keywords your site is ranking for in the top 100 search results. Since you're only interested in the existing keywords with commercial intent, add an additional filter to only include keywords targeted within the /collections subfolder. Click "Apply" and you'll have a list of all the commercial intent keywords your site is currently ranking for at the bottom of page 1, or the top of page 2. These are the most valuable quick-win keyword opportunities for an eCommerce site like Beardbrand, and should always be prioritized when it comes to updating and relaunching content for organic traffic gains.

But once Beardbrand was done with the commercial intent terms, they could then move onto the higher-volume informational terms by analyzing the keyword opportunities housed inside the /blog subfolder, which would return hundreds of high-volume informational keyword opportunities.

Bottom line: use subfolders to focus in on your highest intent terms, when possible.

2) Mine competitors for new high value keyword opportunities

Subfolder research can also be used to find NEW high value keyword opportunities. The process is exactly the same, but instead of entering your site, you'll drop in the commercial intent subfolders from 3-5 direct competitors. For example: Scotch Porter is one of Beardbrand's top organic search competitors.
All their products are housed under the /products subfolder. On the other hand, a competitor such as Beardaholic has all their products housed on the shop.beardaholic subdomain. So Beardbrand would set the corresponding filters to see all the commercial intent keywords each competitor's site is ranking for.

Repeat this process for 3-5 of your top organic search competitors and export the results into an aggregated master Excel file. Set a filter to highlight all the duplicates, and focus in on the new commercial intent opportunities you are not already targeting on your site. This is one of the fastest ways to find loads of high-value keyword ideas that align directly with your site's monetization model.

#5: Expand Your Organic Footprint with Secondary Keywords

One of the fastest ways to increase organic traffic is to get your content to rank for more keywords.

Wirecutter is a site that reviews a bunch of tools and gadgets. It ranks for over 3.1M organic keywords and brings in 4.4M organic visits a month. The founder, Brian Lam, sold the site to The New York Times for $30M in 2016. One of the reasons this site was able to scale its organic footprint so much was that most of the articles on the site ranked for thousands of secondary keywords.

For example, the site's highest organic traffic page, on "best cell phone plans", ranks for 42,478 different keywords.

Note: The primary keyword "best cell phone plans" (48,000 monthly searches) only brings in 5% of the page's overall monthly organic traffic. The rest comes from the other 42,477 semantic and long-tail secondary keywords.

Here is another article from Digital Trends targeting the topic "best laptop backpack for travellers": the post ranks for 4,800 different keywords and rakes in 5,800 organic visits a month.
But, similar to the Wirecutter example above, its top 5 keywords only account for ~20% of the total organic traffic. Rather than target the same keyword repeatedly, Digital Trends has sprinkled secondary keywords (slight variations, re-wordings, or alternate ways to say the same thing) throughout the post. Using long tails and semantics helps Google see the post as being relevant to a range of queries.

OK, so how do you expand the organic footprint of your content? The first step in the process is finding a list of secondary keyword targets with some kind of search volume. Here are a couple of quick ways to do this:

1) Mine competitor articles

Go to the Ahrefs Keyword Explorer tool and enter the primary topic for one of your existing articles. Scroll down to the SERP Overview report to see all the top-ranking articles, along with a number of metrics, including Domain Rating, # of Backlinks, Traffic, # of Keywords, etc. Click the "Kw" link to view all the keywords the page/post is ranking for (and getting traffic from).

Here is another example from the coffee products niche: if I was going to equip myself with affiliate links and write a post comparing coffee grinders, I'd know to include sections on burr grinders, grinders for french press, conical grinders, etc. Without mentioning these kinds of grinders in the post, I'd be missing out on thousands of extra visits when the post started to rank. Before you hit publish (or ideally before you start writing), run the top 10 competing posts through Ahrefs and add their best long-tail keywords to your list. Export all the secondary keywords and remove any duplicates.

Note: This strategy is not only good for finding secondary keywords to include in on-page elements such as title tags, headings and body copy; it can also give you ideas for new sections or topics to cover in the content.
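The "export and remove duplicates" step is worth doing case-insensitively, since exports often list the same term with different capitalization. A quick sketch with sample data (the keyword lists are made up):

```python
# Merge keyword lists pulled from several competitor pages into one
# deduplicated list, ignoring case and stray whitespace. Sample data.

competitor_keywords = [
    ["Burr Grinder", "coffee grinder reviews", "burr grinder"],
    ["grinder for french press", "Coffee Grinder Reviews"],
]

seen = set()
merged = []
for page in competitor_keywords:
    for kw in page:
        key = kw.strip().lower()
        if key not in seen:       # keep the first occurrence only
            seen.add(key)
            merged.append(key)

print(merged)
```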
2) Perform URL-level content gap analysis

If I enter "how to increase organic traffic" into the Ahrefs Keyword Explorer, I can see my article is only ranking for 67 different keywords, compared to the hundreds of different terms many of the competing articles are ranking for. So, how do you quickly identify all the keyword gaps?

Here's how to do it: head over to Site Explorer and enter the URL of your content. Navigate down to the Organic Search section and click the Content Gap link. Next, enter up to 10 competing articles.

Note: make sure you have "URL" selected in the dropdown next to each competitor, since you are specifically interested in all the keywords those competing pages/posts (not their whole domains) are ranking for, but you are not.

You'll have several filter options to choose from. I recommend keeping "At least one of the targets should rank in top 10" selected, as this will help return the most relevant results. Hit search and you'll see a list of all the keywords the competing articles are ranking for, but your article is not.

Bonus tip: as you scan down the list of secondary keyword opportunities, focus on the opportunities where at least two of your competitors are ranking in the top 10 for a given keyword. These will typically be the most relevant targets.

3) Perform long-tail (and semantic) keyword research

Open Keyword Explorer >> enter your primary keyword >> view the Keyword Ideas report. Scan the list of keyword ideas and add any relevant terms with decent search volume to a dedicated list. Finally, export the keywords and remove any duplicates.

How to quickly incorporate secondary keywords into your content

You'll probably find a handful of secondary keywords were naturally included in the article as it was written. But it's always good to revisit the content and make sure all the bases are covered.
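The content gap idea, combined with the bonus tip of requiring at least two competitors to rank, can be expressed in a few lines. All keyword sets below are made-up examples in the coffee grinder niche:

```python
# Content gap: keywords competitors rank for that my page does not,
# prioritizing gaps shared by two or more competitors. Sample data.
from collections import Counter

my_keywords = {"coffee grinder", "best coffee grinder"}

competitor_rankings = {
    "competitor-a.com/grinders": {"best coffee grinder", "burr grinder", "conical burr grinder"},
    "competitor-b.com/grind": {"burr grinder", "grinder for french press"},
    "competitor-c.com/coffee": {"conical burr grinder", "grinder for french press"},
}

# Count how many competitors rank for each keyword we don't cover.
gap = Counter()
for ranked in competitor_rankings.values():
    for kw in ranked - my_keywords:
        gap[kw] += 1

# Per the bonus tip: keep gaps at least two competitors rank for.
priority = sorted(kw for kw, n in gap.items() if n >= 2)
print(priority)
```

Here every gap keyword is shared by two competitors, so all three survive the filter; on real data this threshold cuts the list down sharply.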
With your secondary keyword list in hand, make sure to include them in headings. Headings (wrapped in h1, h2, and h3 tags) are essential for signalling content relevance to Google.

"We do use H tags to understand the structure of the text on a page better" - John Mueller, Google

Looking back at the earlier coffee grinder example, you could derive the entire article structure, including headings, from that list alone:

What is a grinder


