AI Writing · April 10, 2026 · 16 min read

SmallSEOTools keyword position: how to track rankings

by Ivaylo, with help from Dipflow

We ran a SmallSEOTools keyword position check, got a clean number back, and still ended up arguing in Slack because Google “looked different” on three separate laptops. That is the whole problem with small seo tools keyword position tracking: the number feels like a verdict, but it is really a snapshot taken under conditions you did not fully control.

Most people do one check, see a rank that does not match Search Console, then start changing titles, swapping internal links, or buying backlinks out of panic. We have done that too. It is expensive whiplash.

What actually matters is understanding what that rank number represents, how to reproduce it, and how to turn a free checker with a few brittle inputs into something you can trust week after week.

What SmallSEOTools is really telling you when it returns a “position”

SmallSEOTools (the “Keyword Position Checker” or “Keyword Rank Checker”, the naming changes depending on the page) does a straightforward job: it scans a Google results set for a keyword and tries to locate your domain inside that set. If it finds you, it reports a position like #1, #7, #90, #100. That number is a SERP rank position for that moment.

Two details get glossed over on tool pages, so we will say them plainly.

First, the output is not a promise about what every searcher sees. It is what the tool saw, given the Google instance you selected and whatever context the request effectively used.

Second, the tool is binary about discovery. If it checks the top 50 and you are at #73, you might as well not exist. The tool did not “miss” you. It stopped looking.

This is why the results depth selector matters so much. On some UIs and mirror pages you will see a “Check positions up to” list with options like 50, 100, 200, 250, 300, 400, 450, 500. That knob changes the question you are asking. “Where do I rank” is not one question. It is “Do I rank in the first X results, right now, under these settings?”
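To make that concrete, here is a minimal sketch of what a depth-limited check does under the hood. The `serp` list and the function name are ours, purely for illustration:

```python
def position_within_depth(serp, domain, depth):
    """Return the 1-based position of `domain` within the top `depth`
    results, or None if it is not found that deep."""
    for i, result in enumerate(serp[:depth], start=1):
        if result == domain:
            return i
    return None

# Best-ranked first. At depth 3 the tool "stops looking" before us.
serp = ["a.com", "b.com", "c.com", "example.com", "d.com"]
print(position_within_depth(serp, "example.com", depth=3))  # None
print(position_within_depth(serp, "example.com", depth=5))  # 4
```

Same site, same moment, two different answers, because depth changes the question.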

Why your SmallSEOTools position won’t match what you see in Google (and why that is normal)

This is the pain point that causes the most bad decisions. We see it constantly: a business owner checks their keyword in their own browser, sees themselves at #3, runs the checker and gets #11, then opens Search Console and it says “Average position: 6.4.” Three numbers. One keyword. Everyone panics.

Here is the reconciliation framework we use when we want to stop guessing.

SmallSEOTools position is a single snapshot rank: one query, one moment, one data center, one set of assumptions. It is closer to “exact current position” than Search Console, but it is still not universal truth.

Search Console position is an impression-weighted average. If you rank #2 for users in one city and #18 for users in another, Search Console can land in the middle. It also blends device types, different result layouts, and query variants that Google considers “the same” or “close enough.”

Manual checks in your browser are the least trustworthy unless you neutralize them. Your location, your search history, whether you are logged in, your device, and even whether you tend to click certain brands can shift what you see. Google is not shy about personalization.

We keep this mental model: the free rank checker is a thermometer, Search Console is a weekly health report, and your own browser is a funhouse mirror unless you lock it down.

A comparison that actually helps (no tables, just the truth)

When we are training a new teammate, we make them write this down in a doc because it prevents bad interpretations later:

  • SmallSEOTools rank: a point-in-time position for a selected Google TLD, at a chosen depth, based on the domain match the tool detects.
  • Search Console average position: an aggregate across impressions, often across pages, devices, and query close-variants.
  • Manual Google check: a personalized SERP unless you take steps to remove signals.

If you remember nothing else: Search Console is not “wrong” when it disagrees with a snapshot tool. It is measuring something different.

A repeatable validation method (so you stop second-guessing every number)

When a number looks suspicious, we do a controlled re-check. It takes five minutes and saves hours of doomscrolling.

We use the same keyword, the same Google TLD/country, and we try to make “manual Google” behave more like a neutral query.

Pick the country first. If you sell in Australia and you check Google.com out of habit, you are manufacturing drama. Select the right Google instance, or at least the closest Google TLD to the market you serve.

Then run three views of the same query:

1) SmallSEOTools with the correct Google selection and a depth that is likely to include you (start at 100 if you are not sure).

2) An incognito window, logged out, with location signals reduced. If you cannot set a location parameter cleanly, use a VPN endpoint that matches your target region, and do not bounce between endpoints mid-check.

3) Search Console for the same query and the same page if possible. If Search Console shows multiple pages for the same query, that is a clue that Google is rotating URLs and your rank will wobble.

What trips people up is expecting these to match to the digit. They will not. We treat a 5 to 20 position swing as normal noise when you are outside the top 10, especially if different URLs are ranking on different days. Inside the top 10, big swings are less common, and we start looking for a reason: a SERP feature appearing, a location mismatch, a page swap, or an indexing issue.
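Our noise rule can be written down as a tiny function. The 20-position band outside the top 10 is the range described above; the two-position cutoff inside the top 10 is our own working assumption, not something the tool defines:

```python
def is_noise(old_pos, new_pos):
    """Heuristic: outside the top 10, swings of up to 20 positions
    are normal noise; inside the top 10 we treat anything beyond
    a couple of positions as worth investigating (our assumption)."""
    swing = abs(new_pos - old_pos)
    if min(old_pos, new_pos) <= 10:
        return swing <= 2
    return swing <= 20

print(is_noise(40, 55))  # True: a 15-spot wobble in the 40s is noise
print(is_noise(3, 9))    # False: a 6-spot drop inside the top 10 needs a reason
```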

A red flag is consistency in the wrong direction. If every neutral check says you are around #40 and your internal screenshot says #7, your screenshot is the outlier, not the internet.

Setup that prevents bad data before you run anything

Most rank checker “guides” skip setup because it is boring. Setup is where rank tracking becomes either useful or a weekly self-inflicted wound.

Start with the market. SmallSEOTools lets you select a Google search engine instance (often defaulting to Google.com). If you are targeting a specific country, pick the matching Google TLD or country option. Otherwise you are testing the wrong battlefield.

Now decide what “counts” as a rank for your domain. This sounds philosophical until you watch your homepage bounce in and out while a product page quietly holds steady.

If you expect the homepage to rank for a keyword, but Google prefers an inner page, the tool may still report a rank for your domain. It just will not match your mental model of which URL “should” be winning. We have seen teams call a ranking “bad” because the homepage was not the result, even though the ranking page was perfectly fine and converted better.

So pick a rule:

If the goal is brand coverage, track the domain and accept whichever URL ranks.

If the goal is a specific landing page, you need to log the landing URL the tool finds (or manually confirm it) so you can spot URL swaps. URL swaps cause phantom “rank drops” that are really just Google choosing a different page.
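If you log the landing URL with each check, spotting swaps is mechanical. A sketch, assuming a hypothetical log of (date, keyword, landing_url, position) rows already sorted by date:

```python
def find_url_swaps(rows):
    """Return (date, keyword, old_url, new_url) tuples for every
    check where the ranking URL changed since the previous check."""
    last_url = {}
    swaps = []
    for date, keyword, url, pos in rows:
        if keyword in last_url and last_url[keyword] != url:
            swaps.append((date, keyword, last_url[keyword], url))
        last_url[keyword] = url
    return swaps

rows = [
    ("2026-03-01", "crm pricing", "/pricing", 14),
    ("2026-03-08", "crm pricing", "/pricing", 15),
    ("2026-03-15", "crm pricing", "/", 31),  # homepage swapped in
]
print(find_url_swaps(rows))
# → [('2026-03-15', 'crm pricing', '/pricing', '/')]
```

The "drop" from #15 to #31 in that sample is a page swap, not a ranking collapse.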

Last setup detail: build a keyword list that matches intent. A keyword that sounds similar can have a different SERP shape. “Best CRM for real estate” is not the same fight as “real estate CRM pricing.” If you mix intent types in one list, you will misread progress because different pages should rank.

Running the check without breaking the tool (the input rules are petty)

The workflow is simple: enter a domain/URL, choose the Google instance, paste keywords, click “Find Keyword Position” or “Check Position.” The annoying part is how easy it is to feed it bad input.

The tool expects one keyword per line. Not commas. Not semicolons. Not a paragraph.

We have watched smart people paste “keyword A, keyword B, keyword C” and then spend ten minutes wondering why the positions look random. The tool did not interpret three keywords. It interpreted one long keyword phrase.

There is also a batch limit per run: up to 20 keywords per check. If you have 200 keywords, you are doing ten runs. That sounds tedious, but you can make it manageable.

We keep our master keyword list in a sheet, then create a “batch” column that numbers them 1 to 20, 21 to 40, and so on. Copy exactly 20 lines at a time. Run the check. Paste the output back into the same sheet with the date.

Yes, it is manual. It is still less work than trying to debug why a giant paste produced nonsense.
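If you want to skip the sheet arithmetic, the batching itself is a few lines. This assumes the 20-keywords-per-run limit and the one-keyword-per-line input format described above:

```python
def to_batches(keywords, batch_size=20):
    """Split a master keyword list into paste-ready blocks:
    at most `batch_size` keywords, one per line."""
    return [
        "\n".join(keywords[i:i + batch_size])
        for i in range(0, len(keywords), batch_size)
    ]

keywords = [f"keyword {n}" for n in range(1, 45)]  # 44 keywords
batches = to_batches(keywords)
print(len(batches))                # 3 runs: 20 + 20 + 4
print(batches[0].count("\n") + 1)  # 20 lines in the first paste
```

Each element of `batches` is one copy-paste into the keyword box.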

The messy middle: depth, “not found,” and the moment you realize you might be at #438

This is where rank checking stops being cute.

You run the tool, you get blanks or “not found,” and you immediately assume the checker is broken. Sometimes it is. Often it is doing exactly what you asked: it looked at the top 50 and did not see you.

On some interfaces, you can choose how deep to scan. Those depth options (50 up to 500) are not cosmetic. They are your diagnostic tool.

A troubleshooting flow we actually use

When a keyword comes back with no position, we do not jump to conclusions. We walk it.

Start at 50. If not found, go to 100. Still nothing, go to 200. If you are doing competitive terms, it can be worth going to 500 once, just to see whether you exist at all.
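That walk is easy to automate if you wrap the check yourself. The `check` callable here is a hypothetical stand-in for however you run the tool; it just has to return a position or None:

```python
def walk_depths(check, keyword, depths=(50, 100, 200, 500)):
    """Re-run the check at increasing depths until the keyword is
    found; return (depth_used, position), or (max depth, None)."""
    for depth in depths:
        pos = check(keyword, depth)
        if pos is not None:
            return depth, pos
    return depths[-1], None

# Fake checker for illustration: pretend we rank at #173.
fake = lambda kw, depth: 173 if depth >= 173 else None
print(walk_depths(fake, "example keyword"))  # (200, 173)
```

The first two depths "fail" exactly the way a real not-found does, and the walk keeps going instead of panicking.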

Then confirm the domain variant the tool is looking for. This sounds dumb until it bites you. We have seen tools treat these as different strings:

  • http vs https
  • www vs non-www

If your site redirects cleanly, Google usually consolidates it, but a rank checker that is string-matching domains can still get confused. If you entered https://example.com and the result is showing www.example.com, or vice versa, the tool might miss it depending on how it compares.
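A defensive way to compare results is to normalize both sides before matching. A small sketch using only the standard library:

```python
from urllib.parse import urlparse

def normalize_domain(value):
    """Reduce http/https and www/non-www variants to one
    comparable host string."""
    host = urlparse(value).netloc or value  # handles bare domains too
    host = host.lower()
    if host.startswith("www."):
        host = host[4:]
    return host

print(normalize_domain("https://example.com"))     # example.com
print(normalize_domain("http://www.example.com"))  # example.com
print(normalize_domain("www.Example.com"))         # example.com
```

If the normalized strings match, the "different" domains the tool saw are the same site.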

Now check if a different page ranks. If you only care about domain presence, you are fine. If you care about a specific URL, you need to know when Google swapped the ranking page. That swap can explain a sudden position shift.

Also confirm the keyword itself is not being interpreted differently. Small wording changes can flip the SERP from informational to transactional. That changes what can rank. If you are seeing absurd results, we sometimes copy the exact keyword line and paste it into a plain text editor to make sure there is no hidden character or extra whitespace. We have lost time to that. More than once.
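Stripping those invisible characters can be automated instead of eyeballed. This sketch targets the usual suspects (zero-width characters, BOM, non-breaking space); the exact character set is our assumption about what typically sneaks in from docs and chat apps:

```python
def clean_keyword(line):
    """Remove invisible characters and collapse whitespace so the
    pasted keyword is exactly what you think it is."""
    invisible = {"\u200b", "\u200c", "\u200d", "\ufeff"}
    line = "".join(ch for ch in line if ch not in invisible)
    return " ".join(line.replace("\u00a0", " ").split())

dirty = "\ufeffreal estate\u00a0crm  pricing\u200b"
print(repr(clean_keyword(dirty)))  # 'real estate crm pricing'
```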

How we interpret the result ranges (and when to stop checking)

There is a point where deeper scanning stops being useful and starts becoming a coping mechanism.

If you are not found within the top 100, we treat it as “not competitive yet” for that keyword, unless there is a clear reason you should be ranking (like you are the brand the keyword names). At that point, obsessing over whether you are #173 or #312 is not the best use of time.

If you only appear at 400 to 500, that is usually a sign you have a fundamental mismatch: technical problems, indexation weirdness, or content that does not satisfy the intent. Link building at that stage often just burns money because you are pushing the wrong page.

If you are between 21 and 100, rank checking becomes useful because you can often move with on-page improvements, internal linking, and better intent match. The SERP is telling you that you are in the conversation. You are just not winning it.

We are opinionated here: people love the dopamine of “building backlinks” because it feels like action. If you are buried at #450, action is not the problem. Direction is.

Turning a single check into a tracking system that does not waste your life

If you only run the tool when you are anxious, you will only ever learn that rankings are volatile. That is not insight.

A tracking system is boring on purpose. We do it the same way every time so the numbers can mean something.

Cadence: daily vs weekly (and why “daily” is a trap for most teams)

Small movements happen constantly. Google tests, rotates, and personalizes. If you check daily, you will be tempted to attribute every wiggle to the last thing you touched on the site.

For most small teams, a weekly snapshot beats daily checking. Pick a day and time, use the same Google TLD, the same depth rule, and record it. If you are in a launch window or you just fixed a technical problem, daily checks for a short period can be useful. Otherwise, weekly keeps you sane.

We sometimes compromise with a 7-day median: run a check every day for a week when you care about a specific keyword, then take the median position as your real number. It filters out noise. It also forces you to confront whether a “drop” was one bad day.
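The median itself is one line with the standard library, which is part of why we like it as a compromise:

```python
from statistics import median

# Seven daily snapshots for one keyword; one bad day at #38.
daily_positions = [12, 11, 13, 38, 12, 14, 12]

print(median(daily_positions))  # 12: the "drop" was one bad day
```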

A spreadsheet template that survives reality

We keep it lightweight. The point is consistency, not perfection.

Our columns are: date, keyword, Google TLD/country, depth used, reported position, landing URL found, and notes.

Notes is where the truth lives. “Published new section,” “changed title,” “rolled out internal links,” “site downtime,” “migrated to https,” “Google showing map pack now.” If you do not log changes, you will invent stories later.

What nobody mentions is that you should also log when the checker itself behaved oddly. If one run returns strange results across many keywords, we mark it. Free tools can get rate-limited, served different results, or just glitch.
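The same log works fine as a plain CSV if you outgrow the sheet. The column names mirror the list above; the sample row is invented:

```python
import csv
import io

COLUMNS = ["date", "keyword", "google_tld", "depth", "position",
           "landing_url", "notes"]

buf = io.StringIO()
writer = csv.writer(buf, lineterminator="\n")
writer.writerow(COLUMNS)
writer.writerow(["2026-04-06", "real estate crm pricing", "google.com.au",
                 100, 14, "/pricing", "changed title"])
print(buf.getvalue().strip())
```

Same columns every week, appended to the same file, is the whole system.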

How we decide if movement is real

We use simple thresholds by range:

Top 1 to 3: tiny changes matter, and SERP features matter even more. You can be “#2” and still lose half your clicks to a big featured snippet.

4 to 10: improving titles and intent match can move you, but competitors are usually competent.

11 to 20: this is the sweet spot for focused work. Internal links and content upgrades often pay off.

21 to 50: you are visible enough that technical issues and weak content structure get punished.

51 to 100: you are on the edge. Sometimes you are one solid page refresh away from a jump.

Beyond 100: track it, but do not let it run your week. Pick a plan and execute.
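Those ranges are easy to encode so your sheet can label each keyword automatically. The bucket labels paraphrase the guidance above:

```python
def range_bucket(position):
    """Map a position (or None for not-found) to a working range."""
    if position is None or position > 100:
        return "beyond 100: pick a plan, don't let it run your week"
    if position <= 3:
        return "top 1-3: defend, watch SERP features"
    if position <= 10:
        return "4-10: titles and intent match"
    if position <= 20:
        return "11-20: sweet spot for focused work"
    if position <= 50:
        return "21-50: fix technical and content structure"
    return "51-100: one solid refresh from a jump"

print(range_bucket(2))
print(range_bucket(17))
print(range_bucket(438))
```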

What to do after you see the numbers (a prioritized ladder, not a motivational poster)

Once you have positions you trust, the next move depends on where you are.

If you are already top 3, your work is defensive. Watch for SERP layout changes, keep the page fresh, and protect the snippet. We have lost #1 spots to a competitor who simply answered the question more directly in the first 200 words.

If you are 4 to 20, start with intent match and page quality before you touch backlinks. Refresh the content so it actually satisfies the query, tighten headings, add missing sections that competitors cover, and improve internal links from relevant pages. Then worry about links.

If you are 21 to 100, assume the page is not earning enough trust or not signaling relevance clearly enough. We usually fix technical and on-page issues first, then build internal links, then decide whether external links are worth it.

If you are beyond 100, we ask one blunt question: do we even have the right page? Sometimes the answer is “no,” and the correct action is to create a new page with the right angle instead of trying to force an existing one to rank.

RankBrain gets name-dropped a lot in SEO advice, including on some tool pages that call it Google’s third most important ranking signal. We are not going to pretend we can tune a knob labeled “RankBrain.” What we can do is stop publishing pages that make users bounce because the intent is wrong. That is the practical version.

Anyway, we once spent an entire afternoon chasing a “ranking drop” that turned out to be a coworker checking from an airport Wi-Fi location. We still tease them about it.

Trust and safety checks: don’t get fooled by mirrors, ads, and fake features

Free SEO tools live in a swamp of clones and mirrors. We have landed on pages that look right, run right, and still feel off.

A quick sanity checklist helps:

  • Make sure the page has the actual workflow: domain/URL input, Google selection (often default Google.com), keyword box that expects one keyword per line, and a clear “Find Keyword Position” or “Check Position” button.
  • Be skeptical of mirror pages with placeholder “Lorem Ipsum” in the “About” section. That is a sign the page is a template, not a maintained tool.
  • Ignore affiliate promos that look like part of the tool, like “Buy 1 month of Semrush Pro… consultation worth $300.” That is advertising, not a feature.
  • Remember site-level claims like “145+ tools” and “100% free” are platform marketing. They do not tell you anything about how reliable today’s rank result is.

If you are seeing different limits, different outputs, or odd extra metrics like CTR and competition level, you might be on a third-party write-up or a clone that layers in guesses. The core SmallSEOTools checker is about locating a domain’s position in the SERP. Treat anything else as unverified unless you can reproduce it.

The bottom line: treat the number as a measurement, not a prophecy

SmallSEOTools keyword position checks are useful when you treat them like lab readings: controlled inputs, consistent settings, and a record you can compare over time. The tool does not need to be fancy to be valuable. It needs to be repeatable.

If you choose the right Google TLD, respect the one-keyword-per-line rule, work within the 20-keyword batch limit, and use results depth as a diagnostic instead of a panic button, you can get a ranking signal that is good enough to guide real work. Then you stop chasing ghosts and start tracking progress.

FAQ

Why doesn’t SmallSEOTools keyword position match what I see in Google or Search Console?

SmallSEOTools reports a single snapshot rank under specific settings. Search Console shows an impression-weighted average across devices, locations, and sometimes multiple pages. Your own Google results are often personalized unless you remove location and history signals.

What does “not found” mean in SmallSEOTools keyword position checker?

It usually means your site was not detected within the results depth you selected (for example, top 50 or top 100). Increase the depth to confirm whether you rank farther down, or confirm you entered the correct domain variant (www vs non-www, http vs https).

How do I use SmallSEOTools to track keyword rankings over time?

Run the same check on a fixed cadence (usually weekly) with the same Google country/TLD and the same depth. Record the date, keyword, depth, reported position, and the ranking URL so you can separate real movement from URL swaps and noise.

How many keywords can I check at once in SmallSEOTools keyword position?

Typically up to 20 keywords per run, entered one keyword per line. If you have more, split your list into batches and paste results into a spreadsheet with the check date.

Tags: google search console, google tld targeting, keyword rank checker, rank tracking, serp volatility, url swaps