To create a website’s SEO strategy, it is important to carry out an orderly analysis of as many factors as possible related to web optimization and ranking.
An SEO checklist can be the easiest way to carry out this complete review. We suggest one that you can adapt to your project, structured in the following blocks:
Initial setup
First of all, we are going to configure or review the existing configuration of our website, both on analysis and monitoring platforms, and on our own server.
1. Google Analytics
Take advantage of the Google Analytics system from the start to measure the complete evolution of visits to your website. Check that the tracking code is correctly inserted and that it reflects visits in real time.
In addition, you can implement the entire goal and conversion tracking system to see the value of each visit to your website.
2. Search Console
Search Console is the essential tool for all SEOs, with which you will have access to complete monitoring of your website and its status in the search engine.
But if you are passionate about data and want to get the most out of the information provided by Search Console, don’t stop there: with SEOcrawl you can go a step further and add analysis, reports and monitoring so that nothing escapes you.
3. Check different versions of URL
A website can have different versions:
- http://www.myweb.com
- https://www.myweb.com
- http://myweb.com
- https://myweb.com
None of them should return an error, and all of them should point to the main URL, which will be the one you decide, to avoid problems such as duplicate content or 404 errors. Confirm that all secondary URLs send the user to the main URL through a 301 redirect.
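As a sketch of what that redirect setup can look like on an Apache server with mod_rewrite (the domain is a placeholder; the exact rules vary by host):

```apache
# Send every secondary version (http, non-www) to the main URL
# https://www.myweb.com with a single 301 redirect.
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^ https://www.myweb.com%{REQUEST_URI} [L,R=301]
```

On nginx or a CDN the equivalent is a server-level 301 rule; the key point is a single hop, not a chain of redirects.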
4. Keyword Research
Keyword analysis is the cornerstone of any SEO strategy. With the following tasks, you can build a strategy with high profitability potential and a realistic growth plan.
Thanks to detailed keyword research we can identify:
- Core business keywords
- “Long tail” keywords to target immediately
- Keywords your competitors rank for
- And much more…

5. Identify main competitors
There are two ways to analyze competition on the Internet. On the one hand, there is your real competition: the companies in your sector and niche, which you probably know and whose current position in the search engine you must identify.
On the other hand, there are those who occupy the top positions in the key searches that interest you. They may not be competitors in your sector, but they are organic competitors, so they have to be factored into your objectives.
You can search for competition by taking advantage of different tools:
- Your knowledge of the sector.
- Direct searches in the search engine.
- Market analysis with specific tools and software, such as comScore, Semrush, Woorank or Ahrefs, among others.
6. Identify industry leaders
It is easy to identify leaders in a business sector; they are often well-known, international brands.
It can be difficult to match them, but they serve to establish a benchmark, to see what the sector can offer both commercially and SEO-wise and, in the process, to pick up ideas that we can adapt to our own business.
7. Define core keywords of the project
The most difficult task of all is to identify the main keywords (core keywords) of our project and, once they are established, to optimize for them. This process is fundamental, since the success, and most likely the income, of the business depends on them: no matter how many keywords you rank for, the business flow and the conversions almost certainly come from a specific group of keywords, which we call “core”.
On the other hand, we must also start from these keywords to detect new secondary keywords , which can also offer us profitability.
Choosing these main keywords should be done by taking advantage of all the resources at our disposal. Here are some of the most useful ones:
- Again, knowledge of the sector.
- Keyword tools such as Google Keyword Planner.
- Keyword comparison tools such as Google Trends.
- SEO keyword analysis tools, such as Ahrefs, Semrush, Sistrix, keywordtool.io, etc.
Once you’ve identified them, tracking them is easy with SEOcrawl: within Tags, set the conditions for a keyword to be considered core to your business, then use the advanced filters in Top Keywords to save a smart view with those conditions.

From this point on, you will only have to choose the Core option within your list of Smart Views to analyze that set of keywords. The best thing? From the same dashboard you can set the alerts you consider relevant so you don’t have to keep an eye on potential ups and downs every day. Easy, right?

8. Initial snapshot. Starting point
With all these parameters we can establish what is known as the “initial snapshot”. It is like a photograph taken at the start of a race.
Where is your company currently at? Has it made any progress in optimization or ranking? Has it already surpassed any rival in the sector? Establish your starting point and, based on it, set your first SEO objectives.
Note: If you connect your project to SEOcrawl, you will immediately see detailed performance trends over the past 12-16 months.

On-page SEO
On-page SEO is the optimization that is carried out on everything that we control within our own project. Therefore, it is up to us exclusively to analyze it and improve it whenever possible.

9. Sitemap
Ensure your website is indexed as quickly and completely as possible with a sitemap.xml. You can create it by hand, if you have programming knowledge, or take advantage of various plugins that generate it in a single click.
Once you confirm that it is created and uploaded, you should submit it in Search Console. It is not mandatory, but it is a very useful measure to verify that the communication between your website and the search robot is fluid and effective.
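For reference, a minimal sitemap.xml looks like this (the URLs are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per indexable page; <lastmod> is optional -->
  <url>
    <loc>https://www.myweb.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.myweb.com/blog/travels/andalucia-2-days</loc>
  </url>
</urlset>
```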

10. Robots.txt
In most cases, the robots.txt file is simply used to tell search engines which parts of the website they may crawl. A correct configuration will help you avoid crawling problems on pages with little SEO value or even potentially dangerous ones.
In these cases, or if your project requires a more complex robots.txt, it is essential to create it and upload it correctly.
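As an illustration, a simple robots.txt for a hypothetical WordPress site might look like this (the blocked paths are examples, not a recommendation for every project):

```text
User-agent: *
# Keep crawlers out of the admin area and internal search results
Disallow: /wp-admin/
Disallow: /?s=
Allow: /wp-admin/admin-ajax.php

# Point crawlers to the sitemap
Sitemap: https://www.myweb.com/sitemap.xml
```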
11. Web architecture
On-page SEO relies heavily on usability, and this factor changes as navigation standards and technology evolve.
Nowadays, the optimal architecture of a website is one that has:
- a clear and accessible menu
- content that can be reached in the fewest number of clicks possible (ideally, 4 at most)
- access to all parts of the website in an intuitive and fluid way.
From an SEO point of view, this architecture must be user-friendly and focused on the stated objectives, highlighting the most relevant products or services and preventing low-value sections from occupying areas that Google considers important. It is important that this architecture is as clear as possible and that it always makes the importance of each part clear to the crawlers.
12. Friendly URLs
The URL is today one of the most important elements in on-page SEO optimization, which is why its analysis and continuous improvement will represent a great deal of effort within the strategy.
A friendly URL is one that includes primary and secondary keywords, related to the content of the page and framed within our keyword strategy.
The tasks related to this optimization are as follows:
- Review of unfriendly URLs.
- Optimization of existing friendly URLs.
- Creation of new friendly URLs.
Example:
- Incorrect: example.com/blog/category/post?=34645
- Correct: example.com/blog/travels/andalucia-2-days
13. Titles
The title tag is the headline of the page that appears in search results when the website is displayed for a specific keyword.
Title optimization is more complex than it seems, as it must respond to SEO objectives, help promote the brand and respect the limits imposed by Google.
14. Descriptions
The descriptions of all pages follow similar rules to those of the title, except that by having a greater number of characters, they have more options for optimization and creativity.
Although meta descriptions are not an SEO ranking factor, they can have a decisive influence on the CTR, getting more users to click on your result, so it is well worth taking full advantage of their potential.
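A sketch of an optimized title and meta description pair (the page and wording are invented; as a rule of thumb, keep titles around 50-60 characters and descriptions around 150-160 so they are not truncated in the results):

```html
<head>
  <!-- Core keyword first, brand last -->
  <title>Andalucia in 2 Days: Itinerary and Tips | MyWeb</title>
  <!-- Not a ranking factor, but it drives CTR -->
  <meta name="description" content="Plan a 2-day route through Andalucia:
    what to see in Seville and Granada, where to eat and how to get around.">
</head>
```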
For the optimization of the “title” and “meta description” tags, as well as “alt”, headings, etc., which we will see later, we recommend the SEOcrawl extension for Google Chrome, very practical for identifying these and other values for each URL you are browsing.

15. Breadcrumbs
Breadcrumbs are navigation links present on a web page, which serve both to indicate the location to the user and to allow them to navigate between previous sections or categories.
In SEO, breadcrumbs offer both usability and keyword optimization, as each “breadcrumb” allows for the inclusion of a keyword.
Additionally, breadcrumbs allow you to introduce schema.org-specific breadcrumb markup, which helps Google better understand the hierarchies in your architecture.
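A minimal example of that markup, using schema.org BreadcrumbList in JSON-LD (the trail mirrors the hypothetical URL used earlier):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Blog",
      "item": "https://www.myweb.com/blog" },
    { "@type": "ListItem", "position": 2, "name": "Travels",
      "item": "https://www.myweb.com/blog/travels" },
    { "@type": "ListItem", "position": 3, "name": "Andalucia in 2 days" }
  ]
}
</script>
```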

16. ALT tags
In SEO, every factor is important, and the sum of the whole determines which website is best optimized. That is why you should also pay attention to ALT tags, an attribute with a double function that you can optimize with the keywords that interest you most: it is the text displayed when the image does not load, so it helps both Google and the user understand the content of the image.
Certain SEO tools, such as crawler programs like Screaming Frog or similar, can help you identify ALT deficiencies in certain parts of the website so that you can optimize them much more quickly.
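The attribute itself is simple; a descriptive alt text (the wording here is an example) looks like this:

```html
<!-- Describe the image for users and crawlers; include a keyword
     only when it fits naturally -->
<img src="/images/alhambra-granada.jpg"
     alt="View of the Alhambra in Granada at sunset">
```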
17. Correct heading hierarchy
Headings are not only useful for the user, but also have an important role in web design. Users are increasingly scanning texts, looking for the paragraph that best fits the exact information they need.
By using them correctly, that is, using:
- a single H1 per page that defines the main idea
- several H2s that support semantically relevant keywords
- H3s that provide extra information, etc.
we will be helping not only the user, but also Google, to better understand the content of the website.

Analyze your website and make sure that all headings have been implemented correctly: it is very common to fail at this point for design reasons (remember that the function of headings is to organize information, never to lay out content, for which we have other visual resources), often by using a large number of H1s per page, which can confuse Google’s spiders.
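A correct hierarchy for the hypothetical travel post used in earlier examples could be sketched as (indentation only for readability):

```html
<h1>Andalucia in 2 days</h1>
  <h2>Day 1: Seville</h2>
    <h3>Morning: the Cathedral and the Giralda</h3>
    <h3>Afternoon: Plaza de España</h3>
  <h2>Day 2: Granada</h2>
    <h3>The Alhambra</h3>
```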
18. Images
Images are graphic resources that can visually enrich a web page, but you must pay attention to their optimization.
On the one hand, you should reduce their file size as much as possible, without affecting their quality. On the other hand, you should take advantage of their optimization options: they can have alternative texts and titles, they can be linked, their dimensions can be encoded, etc.
Finally, don’t be tempted to overload your content with too many images, as this can affect the loading of your website and reduce the usability of your site.
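A sketch of an image tag that applies these points: a modern compressed format, encoded dimensions (so the browser reserves the space and avoids layout shifts) and native lazy loading for images below the fold. The file name is illustrative:

```html
<img src="/images/mezquita-cordoba.webp"
     alt="Interior arches of the Mezquita in Cordoba"
     width="800" height="533"
     loading="lazy">
```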
19. Internal links
Internal link juice serves to boost the ranking of main pages and, sometimes, to pass part of their authority to those that carry less weight, as long as an appropriate strategy is drawn up and the links are dofollow, so that the search engine takes them into consideration.
Prioritizing the main pages and designing an internal link distribution strategy is key to ensuring that they receive the highest possible authority, thus avoiding less relevant pages having much more weight than them.
20. Mobile Friendly
With the rise of mobile device usage and Google’s announcement of the Mobile First Index (whereby the mobile version of a website is considered the main one and is the first to be indexed by the search engine), it goes without saying that any website must be multi-device. This means it must be “Mobile Friendly”: accessible, fast, usable and functional from any mobile phone.
We can easily check this from Search Console or through the Lighthouse tool.
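One quick thing to verify is the viewport meta tag, the usual starting point of any responsive page; without it, mobile browsers render the desktop layout scaled down:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
```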

21. Hreflang
Hreflang tags are used to tell Google that there are different versions of a website in different languages. They are necessary for multi-language websites and must be correctly implemented, according to Google’s guidelines.

It is very common to make mistakes in their implementation and, in the case of very similar language variants, such as the Spanish of Spain and Latin American Spanish, errors can even lead to duplicate content.
You can check that they have been implemented correctly with the Google Chrome extension Hreflang Tag Checker or with the SEOcrawl SEO extension, which will analyze all the implemented versions and inform you if there are any errors in any of them.
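Following Google’s guidelines, a sketch of the tags for a site with hypothetical Spanish, Mexican Spanish and English versions (every version must list all the alternates, including itself, and the tags must be reciprocal):

```html
<link rel="alternate" hreflang="es"        href="https://www.myweb.com/es/" />
<link rel="alternate" hreflang="es-MX"     href="https://www.myweb.com/mx/" />
<link rel="alternate" hreflang="en"        href="https://www.myweb.com/en/" />
<!-- Fallback for users whose language has no dedicated version -->
<link rel="alternate" hreflang="x-default" href="https://www.myweb.com/en/" />
```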

22. Correct use of pagination
Pagination is one of the on-page elements that SEO professionals pay the least attention to, when in fact it is of great importance.
Correct pagination is not only functional, it must also be optimized. Common errors include:
- Pointing the canonical of every paginated page to the first page (a serious error, since each page has different content)
- Preventing Google from crawling the paginated pages correctly
- Adding them via JS code, making them unreadable to crawlers
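A sketch of pagination that avoids those errors: each page keeps a self-referencing canonical, and the page links are plain, crawlable anchors (the URLs are illustrative):

```html
<!-- On page 2 of a hypothetical category -->
<link rel="canonical" href="https://www.myweb.com/blog/travels/page/2/" />

<nav aria-label="Pagination">
  <a href="/blog/travels/">1</a>
  <a href="/blog/travels/page/2/">2</a>
  <a href="/blog/travels/page/3/">3</a>
</nav>
```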
23. Filters
Filters can become a key part of a website’s SEO, especially in e-commerce. If you have a page that includes them, you should try to optimize them correctly and decide whether you are going to work on SEO on the resulting pages or if you prefer not to index them and prioritize the pages of categories or products that interest you the most.
It is an important decision, but in any case, we recommend that you optimize each page well, so that it always has the potential to be indexed and is attractive to the user.
Incorrect use of filters can generate hundreds of thousands of URLs resulting from combinations between them, causing Google to waste time crawling and indexing these pages, even above your own main products, which can weigh down the entire project.
24. Indexing analysis
From the first day you allow indexing, you should analyze how it evolves from Search Console . The Google tool will identify possible errors and detect low-quality indexed pages, among other issues. In addition, you can perform the same analysis (with some extended functions) from the SEOcrawl Indexing option.

The objective of indexation analysis is both to confirm that the URLs we are interested in are indexed correctly and to try to correct any errors or lack of optimization that the others may be suffering from.
25. 404 Errors
A 404 error occurs when the content or resource that the user is trying to access is no longer available. 404 errors have a very bad reputation, sometimes deservedly so, when in fact they can represent an opportunity.
With tools like Search Console or any link-checking plugin, you can identify whether there are broken links on your website leading to a 404 error page.
SEO optimization consists of deciding whether to redirect that page to an equivalent one or to create an optimized 404 page , with which you offer the user other entry options and avoid losing their visit.

Having 404 errors on a website is not bad per se; Google treats them as something natural. It is failing to deal with them, and letting them gain more and more presence on our site, that can become a problem.
26. Soft 404 Errors
Google mainly identifies these errors in incorrect redirects, that is, redirecting old pages that no longer have value, or no longer exist, en masse to the home page. When a redirect occurs, Google wants it to point to an equivalent page; if we systematically redirect to the home page, those redirects will end up being treated as soft 404 errors.
Imagine you are in a physical store and have walked all the way to the back looking for a particular model of shoes, but when you get there they tell you they are out of stock and, instead of offering you something similar, they take you back to the entrance so you have to walk around searching all over again. That wouldn’t be right, would it? Well, at the web level it is something similar.
27. Redirects – Redirect Chains
Be very careful with redirect chains, i.e. redirects of other redirects. On the one hand, they will increase the loading time of your website and, in addition, they can cause the search engine robot to stop crawling through them.
Example:
- http://example.com (301 redirect 1)
- https://example.com (301 redirect 2)
- https://www.example.com (final destination)
28. Optimized and customized 404 page
As we mentioned before, a 404 page can be an opportunity to deal with any error encountered by a user visiting your website.

Be as creative as you can, take advantage of SEO-optimized and promotional text, and offer a list of links that are valid for the user, which encourages them to continue on the website.
29. Canonical
The canonical URL or canonical link is used to avoid duplicate content. It tells Google which URL is the original and which may be a derived copy, for example, in online stores.
Use canonical links correctly with the structure:
<link rel="canonical" href="http://www.myweb.com/page" />
and make sure your website does not suffer from duplicate content.

30. URL cannibalizations
Cannibalization occurs when two or more URLs try to rank for the same keyword. In SEO, it usually means one of the two URLs is wasted, but it can have more serious consequences, such as neither of them ranking or a poor impression for the user.
With SEOcrawl’s Cannibalizations feature you can identify which URLs are trying to rank for the same keywords and should be optimized independently.

Technical SEO
Within this section of our SEO checklist we have added some points that require more advanced knowledge, but which constitute a fundamental part once we have the rest of the points well optimized.

31. Rendering
With Search Console we can check if our pages are rendered correctly, which will allow us to check if the crawlers are having any difficulty loading and correctly understanding the content of our URLs . It is very common, especially when we start using JS-based technologies, for spiders to have difficulties when rendering the content of a page.
Using Search Console, tools like Screaming Frog or software like SEOcrawl (Crawler function), we can obtain a comparison between the code and what Google actually sees.

32. Rich Snippets
Rich Snippets are rich code fragments, based on Schema.org data markup , that appear prominently in search results .
Not all URLs can take advantage of Rich Snippets, but those that can enjoy their potential benefits should not miss out on them.
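As an illustration, a product page could expose Schema.org data in JSON-LD like this (all values are invented); eligible pages can then show price and rating stars directly in the search results:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Leather hiking boots",
  "image": "https://www.myweb.com/images/boots.jpg",
  "offers": {
    "@type": "Offer",
    "price": "89.90",
    "priceCurrency": "EUR",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "128"
  }
}
</script>
```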
33. Open Graph Tags
Social networks have popularized Open Graph tags , which are used to identify which elements can be shared on these platforms. In other words, how we want them to look once we share content on social networks. You can also check them from SEOcrawl’s Crawler feature.
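A typical set of Open Graph tags for the hypothetical blog post used in earlier examples:

```html
<meta property="og:type"        content="article">
<meta property="og:title"       content="Andalucia in 2 days">
<meta property="og:description" content="A 2-day itinerary through Seville and Granada.">
<meta property="og:image"       content="https://www.myweb.com/images/andalucia.jpg">
<meta property="og:url"         content="https://www.myweb.com/blog/travels/andalucia-2-days">
```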

34. Server Log Analysis
Technical SEO reaches the server. Log analysis allows us to read what is really happening on our website, that is, what exactly Google’s spiders are doing on our site, thus uncovering problems with crawl budget, thin content, crawling, etc.
Log analysis is a very important part, since thanks to crawling programs we have a simulation of what happens on our website, but only by comparing it with the logs will we have a global image of our site .
35. WPO
Web Performance Optimization basically consists of analyzing all the elements that affect the loading time of a website and optimizing them for maximum efficiency.
Google PageSpeed Insights is a free and highly detailed tool that can help you with this important task of technical SEO. We recommend comparing its results with those of another similar tool, such as GTMetrix, to detect all the points of speed loss.

SEO Content
Content is king for Google and, consequently, for its search and indexing robot as well. That’s why you need to optimize the content of all your pages as much as possible if you want to outperform your competition.
When analyzing your website, confirm that the content meets these requirements.

36. The text is structured in paragraphs
Reading usability is an important SEO factor . Structure the text on your pages into paragraphs so that it is attractive and user-friendly.
37. The main question is answered at the beginning
Not only is it an excellent resource in terms of information; Google values it to the point of having created “position zero”, which rewards websites that answer questions directly and concisely. Also, always remember to validate and cite your figures and data so that everything is as credible and trustworthy as possible.
38. Bold type is used
Bold text is still a very useful factor, as it allows the user to quickly “scan” the content and quickly access the section that really interests them. A text with well-used bold text is a useful text and, therefore, has a better chance of ranking.
39. Questions are answered
Google’s “position zero” is also intended for pages that answer questions users have . If you place this answer at the top of the page, you have a better chance of reaching it.

40. Multimedia content is used
Without weighing down the page, you should take advantage of multimedia content to enrich the rest of the page and have more optimized elements.

41. Thin Content
The concept of Thin Content refers to low-quality content, usually due to the absence of sufficient textual content or because, in fact, what exists has very little informative value.
Identifying and optimizing these pages is key, as they can lead to penalty issues if Google considers that you have an excess of them.
Off-page SEO
In the analysis of your website’s SEO, off-page SEO covers the influential factors that occur outside your website. Even so, it offers many optimization options that you can take advantage of.

42. Incoming toxic links
You can take advantage of many tools to analyze incoming links to your website, such as the official Google Search Console, or the best known in this branch such as Ahrefs, SemRush, MOZ, Majestic SEO…
It is important to have a considerable number of inbound links, but in these cases quality matters more than quantity.
Toxic links come from websites with unrelated or even inappropriate content, which can negatively affect your SEO ranking. For this reason, you should disavow them as soon as you detect them.
Whether they are unwanted toxic links or come from a failed SEO strategy, you must keep in mind that any link to your website must come from a site related to your topic, valid for search engines and, if possible, from an authoritative site.
43. Is there any disavow uploaded?
By uploading a disavow file to Google, we indicate which of the incoming links we do not want the search engine to take into account. It is important to know that when you upload a new disavow file, the previous one is replaced, so always check whether a previous file exists and update it instead of overwriting it.
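The disavow file itself is a plain text file with one entry per line; the domains and URLs in this sketch are invented:

```text
# Disavow a single toxic URL
http://spam-links-example.com/directory/page.html

# Disavow an entire domain
domain:link-farm-example.com
```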
44. Anchor text profile
You must create a natural anchor text profile that is suitable for your link building strategy . This must respond to different terms and include related keywords, avoiding focusing solely on the brand + main keyword combination. If the strategy is not natural and Google suspects unlawful techniques, it may end up invalidating the links you get.
45. Natural growth of inbound links
Our link building strategy must be “natural” in the eyes of Google, which means that we must avoid abrupt growth in the acquisition of incoming domains . If, for example, we create a website from scratch, it is not normal for it to receive a high number of links as soon as it is launched. Wanting to rush to increase our authority in a short time can end up being a dangerous weapon. It is better to have a more continuous growth without these jumps.
46. Checking the most used metrics (DA/PA, TF/CF, DR/UR)
In SEO positioning, some of the metrics based on the most widely used tools worldwide, such as MOZ, Majestic SEO and Ahrefs, are often used as a reference.
DR/UR, DA/PA and TF/CF are the metrics related to each of them:
- DR (Ahrefs): Domain Rating. Ahrefs provides this metric to measure the strength of a website’s link profile, based on the quality and quantity of its external links. It is currently considered one of the most reliable ways to measure a link profile. It ranges from 0 to 100.
- UR (Ahrefs): URL Rating. If the previous metric refers to the authority of the whole domain, this one refers to a specific URL. It ranges from 0 to 100.
- DA (Moz): Moz provides its own metric, “Domain Authority”, with which it measures the authority of a website through different factors, not only the quality and quantity of its links. It ranges from 0 to 100.
- PA (Moz): Another Moz metric, known as “ Page Authority ”, allows us to know the authority of a specific page.
- TF/CF (Majestic): The Majestic tool offers us the so-called Trust Flow and Citation Flow , another way of measuring the quantity (CF) and quality (TF) of the links. Remember that you should always try to capture links on websites where the TF is greater than the CF.
47. Has the competitor profile been analyzed?
Off-page SEO involves analyzing your competitors’ profiles. You must keep monitoring and evolve as your rivals do to find niches, opportunities and strategies to overcome them.
48. Correct distribution of links to different pages
Link building is the most important off-page SEO strategy, and also the most difficult. You must seek inbound links from third-party sites to different pages, not always the same one, or Google may identify over-optimization.
Bonus: Monitor your off-page strategy with SEOcrawl’s Links feature
In addition to working on growing your link profile, you should also monitor the links you have acquired to ensure that they are not lost or that they change status between follow and nofollow, for example. To ensure that you do not waste time on this monitoring, SEOcrawl provides you with the Link Monitoring function, which takes care of doing this work for you.

With this complete SEO checklist, you will be able to analyze and keep your website in perfect condition to climb the Google rankings. It is important to analyze each point in detail, develop corrective measures where necessary, and review everything periodically.