Internal Links Are Boring. They Also Tripled Our Organic Traffic
Ivaylo
February 26, 2026
Key Takeaways:
- Define 1-2 outcomes before you touch a single link.
- Join crawl, GSC, and a known-good URL list.
- Split work into hygiene, targets, and donors, then score 0-3.
- Add 2-5 contextual links per donor page, not link dumps.
Internal links are the work nobody wants because it looks like housekeeping. Then we watched an internal linking strategy turn a site that was stuck into one that finally got crawled properly, re-ranked key pages, and stopped bleeding authority into redirects and 404s. It was not glamorous. It was mostly spreadsheets, crawler exports, and fixing our own lazy habits.
We also botched the first pass. We celebrated after adding a bunch of links, then re-crawled and realized half of them pointed at redirected URLs, and two “priority” pages were still basically orphaned because the only links came from a paginated archive Google rarely reaches. That is what this guide is for: doing it like people who have made the mistakes.
Define the goal before touching any links
Pick 1 or 2 outcomes. Not five. If you try to “improve SEO” broadly, you will add random links everywhere, your navigation will get noisy, and you will not be able to prove what worked.
We usually choose from two outcomes:
First is rank lift for revenue or pipeline pages (product, services, “money” category pages). Second is faster discovery and stronger clustering for content that should rank but is buried or underlinked. That choice changes everything: which pages donate authority, which pages receive links, and what success looks like in Search Console.
Completion criteria for this step: you can point to a short list of “important pages” and say why they are important. Important can mean revenue, lead gen, subscription, or “this is the page that must rank for a core query.” Write it down. If you cannot explain it to someone in 20 seconds, you do not have it.
One warning that saves weeks: “Important” does not mean “we like it.” We have had teams spend a month linking to thought leadership pieces while the pricing page and the core service pages sat at 6 clicks deep.
If this goes wrong: you will start arguing about opinions. Recovery is simple: pick one metric that will settle it. If it is pipeline, use conversions or assisted conversions. If it is SEO growth, use impressions for a target query set and organic entrances.
Prerequisites and tooling that actually matter
You need three views of the site, because any single source lies by omission.
One: a crawler export. Use Moz Pro Site Crawl, Siteimprove SEO Intelligence, Semrush Site Audit, or Ahrefs Site Audit. We do not care which, as long as it gives you URL, status code, crawl depth, inlinks count, outlinks count, canonicals, and final destination for redirects. This is your “what exists and how it connects” file.
Two: performance and demand. Google Search Console for queries, impressions, clicks, average position by URL. Analytics for conversions and entrances if you have it. This is your “what matters and what has potential” file.
Three: a known-good URL list. A sitemap helps, but it is not enough. Pull your XML sitemap, plus a CMS export if possible (published URLs), plus any product or category feeds. Sitemaps can contain dead URLs, and they can also omit pages that are live. Both happen.
What trips people up: using only a sitemap or only GSC. The sitemap will miss true orphan pages that were created, then removed from navigation, then forgotten. GSC will miss URLs that have never been discovered, which is the whole point of internal linking.
Completion criteria: you have one spreadsheet or dataset that can be joined on URL with at least these columns: URL, depth, internal inlinks, status code, redirect target (if any), organic impressions, organic clicks, conversions (optional), and page type (blog, category, service, product, support).
If this goes wrong: URLs do not match across exports because of http vs https, trailing slashes, or parameters. Normalize. We have wasted afternoons because one export had `/page` and another had `/page/`. Pick a canonical format and clean everything to that.
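Here is a minimal normalizer in Python, assuming you are joining CSV exports; the exact rules (forcing https, lowercasing hosts, one trailing-slash convention, stripping tracking parameters) are choices you should adapt to whatever canonical format you picked:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters we treat as noise; this list is hypothetical, adapt it to your stack.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "gclid", "fbclid"}

def normalize_url(url: str) -> str:
    """Force one canonical shape so crawl, GSC, and CMS exports join cleanly."""
    scheme, netloc, path, query, _fragment = urlsplit(url.strip())
    scheme = "https"                      # pick one protocol and stick to it
    netloc = netloc.lower()
    if not path.endswith("/"):
        path += "/"                       # pick one trailing-slash rule, any rule
    # Keep only parameters that actually change what the page shows.
    kept = [(k, v) for k, v in parse_qsl(query) if k not in TRACKING_PARAMS]
    return urlunsplit((scheme, netloc, path, urlencode(kept), ""))

assert normalize_url("http://Example.com/page?utm_source=x") == "https://example.com/page/"
```

Run every export through the same function before joining, and the `/page` vs `/page/` afternoon disappears.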
Turn a messy crawl into a prioritized internal linking backlog
This is the part most guides skip, because it is annoying and it looks like project management. It is also where the traffic gains usually come from, because you stop treating every URL like it deserves the same care.
Start by separating “fixes” from “links we will add”
Do not mix broken-link cleanup, authority routing, and architecture changes in the same work queue. People do it, then they cannot ship anything because every task depends on another task.
We split into three queues:
Queue A is technical hygiene: internal 404s, internal redirects, redirect chains, canonical conflicts that break internal flows. This is mostly mechanical and should be fast.
Queue B is “pages to receive links” (targets): important pages that are too deep, underlinked, or orphaned, plus pages with organic potential that are stuck.
Queue C is “pages to link from” (donors): pages with backlinks, traffic, or prominent internal placement that can distribute link equity without becoming a link dump.
Build a simple scoring model you will actually use
You do not need a fancy model. You need a model that a tired team can apply consistently for a week.
We use a 0 to 3 scoring rubric per URL. Yes, it is subjective. That is fine. The value is that it forces a decision.
Business value (0-3): 3 for direct revenue or lead gen pages, 2 for pages that support conversion (comparisons, case studies), 1 for supporting content, 0 for utility pages like login.
Organic potential (0-3): 3 if it already gets impressions but low clicks or sits at positions 8-20, 2 if it targets a clear query and fits your topical map, 1 if it is vague or duplicate, 0 if it should not rank.
Depth penalty (0-3): 0 if depth is 3 clicks or less, 1 if depth is 4, 2 if depth is 5, 3 if depth is 6+. The exact thresholds are not holy. The goal is to force shallow site architecture SEO for pages that matter.
Inlink penalty (0-3): 0 if it has 5+ unique inlinking URLs, 1 if 3-4, 2 if 2, 3 if 0-1. Semrush calls out underlinked pages for a reason. A page with one incoming link is one template change away from becoming invisible.
Fix effort (S/M/L): S if you can fix it in a CMS field or one template, M if it needs content edits across several pages, L if it needs IA changes, redirects, or dev work.
Then compute a priority score like this: (Business value + Organic potential + Depth penalty + Inlink penalty) minus a small effort discount. We keep effort as a tiebreaker, not a gate. If a page is mission-critical, we do the hard thing.
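The arithmetic is trivial; the value is forcing every URL through the same rubric. A minimal sketch, with hypothetical field names and an effort discount you can tune:

```python
def depth_penalty(depth: int) -> int:
    """0 for depth <= 3, then 1 point per extra click, capped at 3."""
    return min(max(depth - 3, 0), 3)

EFFORT_DISCOUNT = {"S": 0.0, "M": 0.5, "L": 1.0}  # tiebreaker, not a gate

def priority_score(row: dict) -> float:
    raw = (row["business_value"] + row["organic_potential"]
           + depth_penalty(row["depth"]) + row["inlink_penalty"])
    return raw - EFFORT_DISCOUNT[row["fix_effort"]]

pages = [
    {"url": "/pricing/", "business_value": 3, "organic_potential": 2,
     "depth": 5, "inlink_penalty": 3, "fix_effort": "M"},
    {"url": "/blog/some-post/", "business_value": 1, "organic_potential": 3,
     "depth": 2, "inlink_penalty": 1, "fix_effort": "S"},
]
backlog = sorted(pages, key=priority_score, reverse=True)  # pricing first: 9.5 vs 5
```

Notice the pricing page wins even at "M" effort. That is the point of keeping effort as a discount instead of a filter.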
Where this falls apart: teams treat the model as a math problem and stop thinking. We have seen a blog post with tons of impressions get scored high even though it was off-topic and could not convert. Keep a “sanity check” column: would you bet your own money this page should rank?
Output three ranked lists that a team can execute
Your spreadsheet should produce:
Top 20 pages to receive links: these are targets. Each needs more inlinks, better donors, or reduced depth.
Top 20 pages to link from: these are donors. Often your homepage, category hubs, high-traffic guides, and pages with real backlinks.
Top 20 technical fixes: internal 404s, internal redirects, redirect chains, and pages with 0 inlinks that should exist.
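Slicing the scored backlog into those three queues is a few list comprehensions. A sketch, assuming dicts with the crawl and GSC columns from the prerequisites step (field names and thresholds hypothetical), already sorted by priority:

```python
def queue_splits(scored_pages):
    """Split a priority-sorted backlog into targets, donors, and fixes."""
    targets = [p for p in scored_pages
               if p["internal_inlinks"] <= 2 or p["depth"] > 3][:20]
    donors = sorted([p for p in scored_pages
                     if p["backlinks"] > 0 or p["organic_clicks"] > 100],
                    key=lambda p: p["backlinks"], reverse=True)[:20]
    fixes = [p for p in scored_pages
             if p["status_code"] == 404 or p.get("redirect_target")
             or p["internal_inlinks"] == 0][:20]
    return targets, donors, fixes
```

The caps are doing real work here: 20 per queue is what keeps the sprint shippable.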
Completion criteria: each target page has an assigned “link goal” like “add 3 new unique inlinking URLs from relevant donors” and a depth goal like “move to 3 clicks or less.” If you cannot state the goal per page, it will become random linking busywork.
If this goes wrong: you end up with 200 “top priority” URLs. Recovery: cap it. We start with 20 targets because a team can actually ship that in 2 to 4 weeks. You can always do another sprint.
Architect an internal link structure that humans and crawlers can live with
Flat architecture does not mean shoving every page into the main menu. That is how you create a giant nav blob that users ignore and crawlers treat as noise.
We aim for a shallow depth target for important pages: 3 clicks or less from the homepage. Plenty of guides are unusually prescriptive about that number, and while it is not a law of physics, it matches what we see in crawls. Deep pages get crawled less, get fewer internal links, and die quietly.
The practical pattern that works: hubs and clusters. Create a pillar or hub page that is the central resource for a topic. Then cluster pages support subtopics and link back to the pillar. Link laterally between clusters only when it is genuinely relevant.
URL hierarchy should match this structure. If your hub is `services/seo/`, then a cluster like local SEO should live at `services/seo/local/` or similar. Readable words, hyphens, and short paths. Avoid cryptic IDs and messy parameters for pages that must rank. This is not aesthetics. It is about keeping the site’s map legible to people and crawlers.
Completion criteria: you can draw the cluster on a whiteboard and point to the hub. If you cannot, your internal link structure is probably accidental.
If this goes wrong: someone insists on adding dozens of “helpful links” to the global footer. Stop. Put them in contextual sections or hub pages instead. Sitewide links are tempting because they are easy, but they blur topical relationships and often spread value thin.
Contextual linking that drives rankings (without making your content unreadable)
Contextual linking is where you tell Google what a page is about, not just where it sits in a menu. Navigational links are for wayfinding. Contextual links are for meaning.
We use a repeatable pattern:
When a page defines a concept, it links to the deeper guide. When a page compares options, it links to the relevant product or service page. When a high-traffic article answers a broad question, it links to the cluster pages that answer the sub-questions.
Anchor text is not a slot machine. Use natural phrases that match what the user expects after clicking. If you force-match exact keywords everywhere, the page reads like a bad affiliate site and it trains editors to hate the process.
The annoying part: it is easy to add links that are technically relevant but contextually weird. We have done this. You read the paragraph back and realize the link is jammed into a sentence where no human would click. Fix it by moving the link earlier, attaching it to the actual concept noun, and keeping it in the same intent. If a reader is learning “internal link structure,” do not send them to a pricing page mid-explanation.
Completion criteria: every target page gets links from multiple unique donor URLs, and at least some of those links are inside body content, not just nav or footer.
If this goes wrong: editors complain the article is turning into a Wikipedia clone. Recovery: cap yourself per section, not per page. There is no magic number of internal links per page, but there is a clear moment when the page becomes a link directory.
The mechanics of link equity distribution (and how to not create a link dump)
This is where most “pass authority” advice dies. People hear “link equity distribution,” then they add a massive related links block to every page and wonder why nothing improves.
Authority routing works when you treat internal links like a deliberate budget. You have a handful of pages that already earn attention: they have backlinks, they rank, they get traffic. Those are your donors. You then route some of that strength to pages that should rank but do not, using contextual links placed where the donor page is already topically aligned.
Find donor pages that can actually donate
Use Ahrefs or Semrush to find pages with backlinks. Use analytics and GSC to find pages with sustained organic entrances. Use your crawler to find pages with high internal inlink counts (often category hubs and navigation magnets).
Then sanity-check the donor pages manually. We have seen “high traffic” pages that are high because they rank for unrelated queries. They are bad donors because the intent mismatch is obvious.
A donor page should have:
Real relevance to the target topic, not just generic “marketing” adjacency.
A stable URL that will not be redirected next quarter.
A section where adding a link will feel natural, usually in an explanatory paragraph, a step list, or a “next to read” sentence that is actually useful.
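The qualitative checks above do not automate well, but a cheap pre-filter does: only consider donor-target pairs that share a topic cluster, then review the survivors by hand. A minimal sketch, assuming each page carries a hypothetical `cluster` label:

```python
def candidate_pairs(donors, targets):
    """Yield (donor, target) URL pairs that share a topic cluster."""
    for t in targets:
        for d in donors:
            if d["url"] != t["url"] and d["cluster"] == t["cluster"]:
                yield d["url"], t["url"]

donors = [{"url": "/blog/seo-basics/", "cluster": "seo"}]
targets = [{"url": "/services/seo/local/", "cluster": "seo"},
           {"url": "/services/ppc/", "cluster": "ppc"}]
print(list(candidate_pairs(donors, targets)))
# [('/blog/seo-basics/', '/services/seo/local/')]
```

The PPC page gets no pairing from an SEO donor, which is exactly the intent-mismatch filter you want before a human reads the actual pages.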
Choose targets that deserve the help
Targets are not “pages we want to rank” in the abstract. Targets are pages with a clear query intent, a reason to exist, and a content quality level that will not embarrass you once they get more traffic.
We learned this the hard way: internal links can lift visibility, but they can also expose thin pages. You do not want to route authority into a page that is outdated, duplicative, or missing the answer.
Place links where equity and meaning both make sense
The placement matters more than people admit. A link buried in a generic “related posts” widget can get crawled, but it often carries less semantic clarity than a link embedded in a sentence that names the concept.
Our safeguard rules:
Donor pages add a small number of new contextual links per update, usually 2 to 5, inside relevant sections. Not sitewide. Not a 30-link appendix.
Targets should receive links from multiple unique donor URLs. One donor is fragile. One template change, one content refresh, and your target page is back in the dark.
We also avoid pointing donors at URLs that redirect. Redirect drag is real. It adds hops for users and crawlers and it is one of those silent “death by a thousand cuts” problems.
Anyway, the worst part of this work is that it is invisible until it suddenly is not. You will spend a day updating 50 links and your boss will ask what you did. You will say “we removed redirects.” You will get a blank stare.
Measure before and after, or you are guessing
Before you add links, record for each target page: GSC impressions, average position, clicks, crawl frequency if you can infer it from server logs or crawl timestamps, and internal inlink count from your crawler.
After you ship, wait. Internal linking changes can show up fast in crawl data but slower in rankings. Re-crawl within a week or two to confirm the links exist and are followable. Check GSC trends over 2 to 6 weeks depending on site size.
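Write the baseline down somewhere boring and durable, like a dated CSV. A sketch, assuming `crawl` and `gsc` are dicts keyed by normalized URL (shapes hypothetical):

```python
import csv, datetime

def snapshot(target_urls, crawl, gsc, path):
    """Record per-target baselines so the after-comparison is not guesswork."""
    with open(path, "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["date", "url", "inlinks", "depth",
                    "impressions", "clicks", "avg_position"])
        today = datetime.date.today().isoformat()
        for url in target_urls:
            c, g = crawl.get(url, {}), gsc.get(url, {})
            w.writerow([today, url, c.get("inlinks"), c.get("depth"),
                        g.get("impressions"), g.get("clicks"), g.get("position")])

snapshot(["/pricing/"],
         {"/pricing/": {"inlinks": 2, "depth": 5}},
         {"/pricing/": {"impressions": 900, "clicks": 12, "position": 14.3}},
         "baseline.csv")
```

Re-run the same snapshot after shipping and the diff is your before/after, not a memory.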
Completion criteria: you can show that target pages increased in internal inlinks, moved closer to 3-click depth if that was the goal, and have rising impressions or improving average position for relevant queries.
If this goes wrong: you see no movement and assume internal links do not work. Recovery: check the boring blockers. Are the donor pages actually crawled often? Are the links nofollowed by accident? Are the targets canonicalized somewhere else? Are the targets blocked by robots or stuck behind “hidden navigation” that crawlers cannot access? Moz calls this out because it is a real failure mode: you can have great content and still be invisible.
Internal link hygiene: fix the stuff that quietly breaks everything
This is the order we follow because it reduces compounding pain.
First, fix broken internal links that cause 404s. They are pure loss: bad UX and wasted authority signals.
Second, fix internal links that point to redirected URLs. Update them to the final destination (a resolver sketch follows this list). Semrush flags these because they are common, and because redirect chains happen when people “quick fix” instead of updating links.
Third, deal with orphan pages. If a page has 0 internal links pointing to it, it might not get indexed even if it is in the sitemap. We have watched sitemapped pages sit unindexed for months because nothing links to them.
Fourth, fix underlinked pages. A page with one incoming internal link is a hair away from becoming an orphan.
Fifth, address depth. If important pages are deeper than 3 clicks, reduce depth by linking from hubs, relevant category pages, and other high-level nodes. Do not brute-force it with mega-menus.
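For the redirect step, a minimal resolver using the `requests` library; follow each flagged link target once, then rewrite the link to whatever it lands on:

```python
import requests

def final_destination(url: str, timeout: float = 10.0):
    """Follow redirects and report the final URL, hop count, and status."""
    r = requests.get(url, allow_redirects=True, timeout=timeout)
    return r.url, len(r.history), r.status_code

# Usage: rewrite any internal link whose target redirects (URL hypothetical).
final_url, hops, status = final_destination("https://example.com/old-slug/")
if hops:
    print(f"update link: {hops} hop(s) -> {final_url} ({status})")
```

`r.history` holds one response per hop, so it doubles as your redirect-chain detector: anything above 1 is a chain worth flattening.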
What to do when you cannot change templates: you still have options. Add links in body content on high-authority pages. Build a hub page that is accessible from the main navigation with one link, then use it as the structured gateway to the cluster. If the CMS is locked down, you can often edit a few “evergreen” pages that drive most of the internal graph.
If a page must stay orphaned: rare, but it happens (private campaigns, legal pages). Keep it out of the sitemap if it is truly not meant for discovery, or noindex it if it must exist but should not rank. Do not pretend this is an SEO win.
Completion criteria: your crawler shows zero internal 404 links, a sharply reduced count of internal links to redirects, and no orphan pages among your “important” set.
Breadcrumbs and navigational links as controlled structure signals
Breadcrumbs are not the main ranking lever, but they clean up a lot of confusion, for both users and crawlers.
Place breadcrumbs near the top of the page. Keep labels consistent and reflective of the real hierarchy, like “Home > Blog > Technical SEO > Internal Linking”. If your breadcrumb says one thing and your URL path and navigation say another, you are sending mixed signals and users feel it.
Add breadcrumb structured data if you can. It is one of those small details that can improve how pages appear in SERPs, and it forces you to be honest about hierarchy.
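The markup itself is schema.org `BreadcrumbList` as JSON-LD. A sketch that builds it from (name, URL) pairs, with hypothetical example URLs:

```python
import json

def breadcrumb_jsonld(trail):
    """Build schema.org BreadcrumbList markup from (name, url) pairs."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(trail, start=1)
        ],
    }, indent=2)

print(breadcrumb_jsonld([
    ("Home", "https://example.com/"),
    ("Blog", "https://example.com/blog/"),
    ("Technical SEO", "https://example.com/blog/technical-seo/"),
    ("Internal Linking", "https://example.com/blog/technical-seo/internal-linking/"),
]))
```

Generating it from the same hierarchy data that renders the visible breadcrumb is what keeps the two honest with each other.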
Completion criteria: breadcrumb trails match your actual category and hub structure across the site.
If this goes wrong: someone tries to use breadcrumbs as a second navigation bar stuffed with keywords. Stop. Breadcrumbs are for hierarchy, not for cramming links.
Internal linking automation and workflows that do not create spam
Internal linking is not a one-time project. If you treat it like one, the site will drift back into chaos as new pages ship.
Automation should suggest, not publish. Tools like Yoast SEO Premium’s internal linking suggestions can help editors find related content, but they will not understand your business priorities or the difference between a donor and a target.
We run a publish-time checklist inside the editorial workflow:
Before a page goes live, it must link to its hub or pillar page. Then it must include at least 2 contextual links to relevant cluster pages or supporting content, and it must receive at least 2 inbound links added from existing pages during the same sprint. This is the part everyone skips because it requires editing older content.
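That checklist is mechanical enough to enforce in code. A sketch, assuming a hypothetical `site_graph` dict that maps each URL to the set of URLs it links to:

```python
def publish_checklist(page, site_graph, hub_url):
    """Return the checklist failures for a page about to go live."""
    out = site_graph.get(page, set())
    inbound = {src for src, dsts in site_graph.items() if page in dsts}
    failures = []
    if hub_url not in out:
        failures.append("missing link to hub/pillar page")
    if len(out - {hub_url}) < 2:
        failures.append("fewer than 2 contextual links to cluster content")
    if len(inbound) < 2:
        failures.append("fewer than 2 inbound links added from existing pages")
    return failures

graph = {"/blog/new-post/": {"/services/seo/"}}
print(publish_checklist("/blog/new-post/", graph, "/services/seo/"))
```

The inbound check is the one that catches the skipped step: it fails until someone actually edits older content.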
Quarterly, we re-crawl and rerun the backlog scoring. New orphans appear. Redirects creep back. Someone changes a URL slug and forgets to update internal links. It happens.
What nobody mentions: automation at scale can create sitewide irrelevant links if you let it. That damages UX and topical clarity. Put a human in the loop, even if it is a tired human doing a 10-minute review.
Completion criteria: every new important page ships with a defined place in the internal link structure and is not reliant on “maybe someone will link to it later.”
Verification: how to know you actually succeeded
Run a fresh crawl after shipping changes. Do not guess. Confirm:
No orphan pages among important URLs.
Priority URLs are at 3 clicks or less from the homepage, or you have a documented reason they cannot be.
Internal 404s are gone, and internal links pointing to redirects are minimized by updating to final URLs.
Underlinked priority pages have multiple unique inlinking URLs, not just a single token link.
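Most of that list falls out of a breadth-first walk over the link graph from your crawl export. A sketch with a toy graph (shapes hypothetical):

```python
from collections import deque

def crawl_depths(site_graph, homepage):
    """BFS click depth from the homepage; anything unreachable is orphan-like."""
    depth = {homepage: 0}
    queue = deque([homepage])
    while queue:
        url = queue.popleft()
        for nxt in site_graph.get(url, ()):
            if nxt not in depth:
                depth[nxt] = depth[url] + 1
                queue.append(nxt)
    return depth

# Toy graph: URL -> set of internal links on that page.
site_graph = {
    "/": {"/services/seo/", "/blog/"},
    "/services/seo/": {"/services/seo/local/"},
    "/blog/": set(),
}
depth = crawl_depths(site_graph, "/")
orphans = set(site_graph) - set(depth)             # crawled but never linked to
too_deep = [u for u, d in depth.items() if d > 3]  # priority pages should be empty here
```

Run it on the fresh post-ship crawl, not the old one, or you are verifying the problem instead of the fix.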
Then validate in GSC: rising impressions and improved average position for the target pages and their query set. Watch crawl behavior if you can, because improved internal link structure often shows up as more consistent crawling before rankings move.
If rankings do not move but crawling improves, do not panic. It means the internal linking strategy did its job on discovery and equity paths. Now you are looking at content quality, intent match, or external competition. That is a different fight.
FAQ
So what is an internal linking strategy, in plain English?
It is a plan for who links to what, and why. We treat it like a routing map: donor pages that already get crawled and earn attention send contextual links to target pages that should rank, plus a hygiene pass so we stop bleeding authority into redirects and 404s.
The shortcut trap: can we just add a giant footer with links to everything?
Stop. We have watched this turn into a nav blob that users ignore and crawlers treat like noise. It also smears topical relationships because everything links to everything. Put the structure in hubs and clusters, then add contextual links where the topic is actually being discussed.
What counts as an internal link, and which ones actually move the needle?
An internal link is any link from one page on your domain to another: main navigation, breadcrumbs, in-body links, related widgets. The ones that keep paying us back are in-body contextual links from relevant donor pages, plus clean navigation paths that keep priority pages at 3 clicks or less.
We added links and nothing happened. What did we probably mess up?
Our first miss was stupid: half the new links pointed at redirected URLs, and two priority pages were still orphaned because the only links were buried in a paginated archive. The usual culprits:
– Donor pages are not crawled much (or they rank for irrelevant queries)
– Links point to redirects or chains, not the final URL
– Targets are canonicalized somewhere else or blocked by robots
– You only used one donor, so one template change nukes the whole setup
Re-crawl within a week or two. If the graph did not change, rankings will not either.