Knowing which keywords a website ranks for is one of the most useful exercises in SEO, but it changes completely depending on whether we're talking about your own site (you have real data) or a competitor's (you only have estimates). In this guide you'll build an actionable list of keywords, their relationship to URLs, and a prioritization method to decide what to touch first. If you work with an all-in-one suite such as Makeit Tool, you can speed up research and monitoring, but the underlying process is the same.
“Before obsessing over 'what keywords to rank', make sure you answer the right question: What URLs are capturing demand and where is there real opportunity (impressions + position 8—20)? Search Console is the truth of your website; the rest are useful estimates, but not judgment.”
If the website is yours (or you have access), Google Search Console is the starting point because it shows real queries with clicks, impressions, CTR and average position. It's not an “estimate”: it's the performance observed by Google for your property. It's the most defensible dataset for prioritizing changes and measuring impact.
If the site belongs to a competitor, you can't see their real Google data. SEO tools will give you an estimated list of keywords and URLs that appear to rank, based on their own samples and databases. This is enough to understand their demand map and spot opportunities, but not to guarantee exact figures.
With third-party tools you won't see every keyword, and many long-tail queries will be missing. It's also common for positions to fluctuate depending on country, device and personalization. In Search Console, thresholds and privacy filtering mean that not 100% of queries appear either, but it's still the most useful reference for decisions about your own site.
What you'll get at the end of the process
When you're done, you should have three things: a list of queries with metrics, a clear view of which URLs are capturing what intent, and a shortlist of prioritized opportunities (quick wins, low CTR, cannibalization, and topics that deserve new content).
First make sure you're looking at the appropriate property (domain or URL prefix). If you have several (http/https, subdomains, etc.), choosing the wrong property is a typical cause of “I don't see my keywords”.
Go to Performance and open the Search results report. On the Queries tab you'll see the keywords. Enable all four metrics and understand what each one tells you: clicks are visits from Google, impressions are visibility, CTR is snippet efficiency, and average position is an average (indicative, not a fixed ranking).
Work with time windows that have a signal. If your website has low traffic, 90 days will give you more stability than 28. If you're measuring recent changes, 28 days is useful, but always comparing equivalent periods.
To attribute a keyword to a page, cross-reference with Pages. In the same report, go to the Pages tab, select a specific URL, and return to Queries. This filter shows you the keywords that trigger that URL and lets you see whether the page is well focused or mixes intents.
If, when you filter a URL, you see it ranking for very different types of queries (for example, definitions mixed with comparisons), it's usually a sign of missing structure, missing sections, or a page trying to answer too much. In these cases the job is usually not to "change the title" but to reframe: clarify the main intent and move sub-intents to satellite pages.
Cannibalization usually occurs when several pages on the site compete for the same intent. You can detect it by checking whether different URLs appear for very similar queries or whether performance is split between them. The typical symptom is instability: sometimes one URL ranks and sometimes another, with an irregular CTR.
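The check above can be automated over an export. A minimal sketch, assuming rows with `query` and `url` fields (the field names mirror a typical Search Console export and are an assumption, adapt them to yours):

```python
from collections import defaultdict

def detect_cannibalization(rows, min_urls=2):
    """Flag queries served by several URLs: cannibalization candidates."""
    urls_by_query = defaultdict(set)
    for row in rows:
        urls_by_query[row["query"]].add(row["url"])
    # Only queries answered by min_urls or more distinct URLs are suspicious
    return {q: sorted(u) for q, u in urls_by_query.items() if len(u) >= min_urls}

sample = [
    {"query": "seo audit", "url": "/blog/seo-audit"},
    {"query": "seo audit", "url": "/services/audit"},
    {"query": "keyword research", "url": "/blog/keyword-research"},
]
print(detect_cannibalization(sample))
# {'seo audit': ['/blog/seo-audit', '/services/audit']}
```

Treat the output as a shortlist to review by hand, not a verdict: two URLs for one query can be legitimate (e.g. a blog post and a product page ranking for different SERP slots).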
For consistent analysis, filter by country and device if your business depends on a specific market or if you see strong CTR differences between mobile and desktop. It's also a good idea not to mix search types if your goal is organic web.
Export to CSV or Google Sheets and add fields that allow you to prioritize without looking row by row. The idea is to create simple “tags”: a position bucket (top 3, top 10, 11—20, 21+) and a suggested action (snippet, content, linking, consolidation). This turns the report into a to-do list, not an informational list.
When a query has many impressions and sits in positions 8—20, you're usually close to capturing more clicks. What usually works in these cases: reinforcing the content (subtopics the top 10 covers and you don't), adjusting the title to better reflect the promise, and improving internal linking from related pages to the target URL.
A low CTR with lots of impressions can mean three things. The snippet may be uncompetitive (title and meta neither differentiate nor promise a specific result). The SERP may contain modules or competitors that absorb clicks and depress the "natural" CTR. Or your content may not match the dominant intent. The correct action changes accordingly: in the first case, rewrite title/meta; in the second, reconsider the angle or attack a sub-intent; in the third, reframe the content or create a URL for the correct intent.
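One way to surface snippet problems is to compare each query's CTR against a baseline for its position. The baseline curve below is an illustrative assumption (real curves vary widely by SERP layout and brand queries), but the mechanic is the useful part:

```python
# Assumed baseline CTR by rounded position; replace with your own site's curve.
EXPECTED_CTR = {1: 0.28, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                6: 0.04, 7: 0.03, 8: 0.03, 9: 0.02, 10: 0.02}

def ctr_gap(position, ctr):
    """Negative gap = the snippet underperforms its position's baseline."""
    expected = EXPECTED_CTR.get(round(position), 0.01)
    return round(ctr - expected, 3)

print(ctr_gap(2.1, 0.04))  # -0.11: a position-2 query far below baseline
```

Sorting the export by the most negative gaps gives you the title/meta rewrite queue; large negative gaps on queries whose intent you do match point instead at SERP modules stealing the click.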
Sometimes you'll see keywords in good positions with few impressions. This usually indicates low demand or very specific long-tail. It's not “bad”, but it's rarely the first thing you should optimize. The value lies in adding long-tail within a cluster and reinforcing the canonical page of the topic.
If you see impressions spread across similar URLs, positions that change without stability, and irregular CTR, it's a sign of internal competition. The typical solution is to define one canonical URL per intent, consolidate content if there are duplicates, reorient sub-intents and reinforce internal linking to the URL that should rank.
A useful reading is to think by intention: which queries are informational, which are decision-making and which are navigation/branding. In general, decision pages usually justify improvements or new “evaluation” pages (comparisons, alternatives, choice guides), while informational pages should have a logical bridge to the next step (resources, checklists, or use case pages if it fits).
In an SEO tool, the usual flow is domain → organic keywords. From there, you filter by country and position (top 3, top 10, top 20) and check the URL associated with each keyword. The goal is not to copy a list but to identify clusters: which topics they work on, which pages act as hubs, and which intent each one captures.
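As a first pass over an exported keyword list, even a naive grouping helps reveal clusters. This sketch groups by the first token, an assumption for illustration only; production clustering usually groups by SERP overlap or embeddings:

```python
from collections import defaultdict

def cluster_by_head_term(keywords):
    """Naive topical clustering: group keywords sharing the first token."""
    clusters = defaultdict(list)
    for kw in keywords:
        clusters[kw.split()[0]].append(kw)
    return dict(clusters)

print(cluster_by_head_term(["crm pricing", "crm comparison", "email templates"]))
# {'crm': ['crm pricing', 'crm comparison'], 'email': ['email templates']}
```

Even this rough grouping makes hub pages visible: a head term with many variants usually deserves one canonical page plus satellites.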
Rather than focusing on "which keywords", check which pages concentrate visibility, because that tells you the type of content that works for them (comparisons, guides, resources, integrations, etc.). If the tool offers history, use it to detect what they have recently published or updated that might explain recent gains.
Before assuming that "this keyword is good because the competitor ranks for it", validate 10—20 real SERPs: see which format dominates (guide, landing page, comparison), which type of domain wins (media, SaaS, forums), and whether the competitor's URL really answers the dominant intent. This keeps you from publishing the wrong format.
Tools don't see everything, especially long-tail, and they can fail in specific positions. Use them for hypotheses and competitive mapping, and confirm important decisions with real SERP and with your own performance (when you have one).
site:domain.com helps you discover sections, see which pages are indexed and find content by topic. It doesn't tell you which keywords bring impressions, nor show CTR or position. It's useful as a quick scan, not as performance analysis.
If you don't have tools, you can use suggestions from Google (autocomplete) and PAA to understand how demand is formulated and what sub-questions exist. This doesn't replace a list of “keywords you rank for”, but it does help you create or improve content with real intent.
A simple monthly review system covers: top queries per cluster, top pages, opportunities (positions 8—20), abnormal CTR, and signs of cannibalization. With that, you can decide what to optimize and what to consolidate without improvising.
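Two of those buckets can be filled mechanically from the export. A sketch; the field names mimic a Search Console export and the thresholds (500 impressions, 2% CTR) are assumptions to tune per site:

```python
def monthly_review(rows):
    """Split an export into two of the monthly review buckets."""
    return {
        # positions 8-20 with real demand: quick-win candidates
        "quick_wins": [r for r in rows
                       if 8 <= r["position"] <= 20 and r["impressions"] >= 500],
        # visible on page one but under-clicked: snippet candidates
        "low_ctr": [r for r in rows
                    if r["position"] <= 10 and r["ctr"] < 0.02
                    and r["impressions"] >= 500],
    }

rows = [
    {"query": "a", "position": 12, "impressions": 900, "ctr": 0.01},
    {"query": "b", "position": 4, "impressions": 1500, "ctr": 0.01},
    {"query": "c", "position": 2, "impressions": 50, "ctr": 0.30},
]
report = monthly_review(rows)
print([r["query"] for r in report["quick_wins"]])  # ['a']
print([r["query"] for r in report["low_ctr"]])     # ['b']
```

Cluster grouping and cannibalization still need the query→URL cross described earlier, but these two filters alone already seed the monthly backlog.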
The important thing is not just to look at the data, but to record what you changed and when. If you note the date and the action (snippet, content, link, consolidation), you can compare equivalent 28- or 90-day periods and attribute improvements with more credibility.
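The "equivalent periods" part is easy to get wrong by hand. A small helper can derive matching before/after windows from a logged change date (a sketch, using 28 days by default as in the text):

```python
from datetime import date, timedelta

def comparison_windows(change_date, days=28):
    """Return equal-length (before, after) date ranges around a change."""
    before_end = change_date - timedelta(days=1)
    before_start = before_end - timedelta(days=days - 1)
    after_end = change_date + timedelta(days=days - 1)
    return (before_start, before_end), (change_date, after_end)

before, after = comparison_windows(date(2024, 6, 1))
print(before)  # (datetime.date(2024, 5, 4), datetime.date(2024, 5, 31))
print(after)   # (datetime.date(2024, 6, 1), datetime.date(2024, 6, 28))
```

Feeding both ranges to the same filtered report is what makes the before/after comparison defensible; note that adjacent windows still won't control for seasonality, so year-over-year checks remain useful.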
Monthly is usually sufficient for reporting and backlogging. Weekly is only recommended if you are undergoing migration, update campaigns or major changes where you want to detect anomalies soon.
An all-in-one suite such as Makeit Tool can help you centralize competitive research: detecting which clusters others are working on, which URLs capture intent and where the gaps are. The value lies in reducing exploration time and keeping the analysis consistent.
The step that makes the most difference is moving from “single keywords” to clusters based on intent and theme. That's where the analysis ceases to be informative and becomes a strategy: a canonical URL based on intent, satellites that cover sub-questions and internal linking that reinforces the target page.
The operational flow should always end in actions: updating, creating satellite, consolidating, strengthening links. And then measure in comparable windows (28/90) to know if the change worked, without depending on a specific day.
Not filtering by country is one of the most common causes of false conclusions. If your market is Spain, filter for Spain; otherwise, the average misleads you.
Switching from 28 to 90 days, or comparing windows with different seasonality, makes it look like you're "going up or down" with no real cause.
The average position is useful as a signal, but it's not the goal. To prioritize, combine impressions, CTR and position.
Optimizing a definition when the SERP is comparative, or vice versa, usually gives poor results even if “the keyword is good”.
Without the keyword → URL cross, you work blindly. A lot of decisions depend on which page is capturing what intent.
On competitors, validate important SERPs and use the tools as a map, not as absolute truth.
On mobile, CTR and behavior change. If there are strong differences, analyze separately.
Not completely and reliably. Without access to Search Console, you'll only have estimates or samples. For free, you can get clues with manual SERP, site: and top content analysis, but not an exhaustive list with real metrics.
It's normal. There are thresholds and privacy, and in addition the period and filters can hide data. Expand to 90 days for more signal, review per page (filtering a URL) and compare equivalent periods if you're evaluating changes.
It depends on the objective, but for prioritizing it's often more useful to look at impressions (opportunity), CTR (efficiency) and position (closeness) together. A keyword in position 12 with many impressions is usually a better opportunity than a keyword in position 3 with few.
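That three-signal comparison can be made sortable with a composite score. The formula below is a hypothetical weighting for illustration, not an established metric: demand (log of impressions) weighted by closeness to page one, discounted by CTR already captured:

```python
import math

def opportunity_score(impressions, ctr, position):
    """Illustrative composite score; the weighting is an assumption."""
    # closeness peaks around position 10 (the page-one threshold) and
    # decays linearly to zero fifteen positions away
    closeness = max(0.0, 1 - abs(position - 10) / 15)
    return math.log10(impressions + 1) * closeness * (1 - ctr)

# Position 12 with high demand outranks position 3 with little demand:
print(opportunity_score(5000, 0.01, 12) > opportunity_score(80, 0.25, 3))  # True
```

Any monotone combination of the three signals would do; the point is to rank rows by one number instead of eyeballing three columns.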
In Search Console, filter by page and go back to queries, or filter by query and view pages. In SEO tools, look for the ranking-URL column associated with the keyword. The key decision is to define one canonical URL per intent and reinforce it.
Monthly is the most sustainable thing for reporting and backlog. Weekly only if you are in a phase of intensive changes (migration, major updates, content campaign) and you need to detect deviations soon.
Take advantage of all the resources we offer to build a strong link profile.