Technical Optimization
11.02.2022
What is technical SEO?
Technical SEO refers to improving the technical aspects of a website in order to increase the ranking of its pages in the search engines. Making a website faster, easier to crawl and understandable for search engines are the pillars of technical optimization. Technical SEO is part of on-page SEO, which focuses on improving elements on your website to get higher rankings. It’s the opposite of off-page SEO, which is about generating exposure for a website through other channels.
Why should you optimize your site technically?
Google and other search engines want to present their users with the best possible results for their query. Therefore, Google’s robots crawl and evaluate web pages on a multitude of factors.
Some factors are based on the user’s experience, like how fast a page loads. Other factors help search engine robots grasp what your pages are about. This is what, amongst others, structured data does. So, by improving technical aspects you help search engines crawl and understand your site. If you do this well, you might be rewarded with higher rankings or even rich results.
It also works the other way around: if you make serious technical mistakes on your site, they can cost you. You wouldn’t be the first to block search engines entirely from crawling your site by accidentally adding a trailing slash in the wrong place in your robots.txt file.
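As an illustration of how small that mistake can be, compare these two robots.txt fragments (the directives are standard; the blocked path is hypothetical):

```text
# Intended: keep crawlers out of one directory only
User-agent: *
Disallow: /admin/

# Accident: a lone slash blocks the ENTIRE site from crawling
User-agent: *
Disallow: /
```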
But it’s a misconception you should focus on the technical details of a website just to please search engines. A website should work well – be fast, clear, and easy to use – for your users in the first place. Fortunately, creating a strong technical foundation often coincides with a better experience for both users and search engines.
The characteristics of a technically optimized website
A technically sound website is fast for users and easy to crawl for search engine robots. A proper technical setup helps search engines to understand what a site is about and it prevents confusion caused by, for instance, duplicate content.
1. Loading speed
Nowadays, web pages need to load fast. People are impatient and don’t want to wait for a page to open. Research from as far back as 2016 showed that 53% of mobile site visitors will leave if a webpage doesn’t load within three seconds. So if your website is slow, people get frustrated and move on to another website, and you’ll miss out on all that traffic.
Google knows slow web pages offer a less than optimal experience, and therefore prefers web pages that load faster. So, a slow web page also ends up further down the search results than its faster equivalent, resulting in even less traffic. And since 2021, page experience, which reflects how fast users perceive a web page to be, is an official ranking factor. So make sure your pages load quickly!
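Server configuration is one of the levers here. As a rough sketch (the nginx directives are standard, but the file types and cache lifetime are illustrative choices), you might enable compression and browser caching like this:

```nginx
# Compress text-based responses before sending them to the browser
gzip on;
gzip_types text/css application/javascript application/json image/svg+xml;

# Let browsers cache static assets so repeat visits load faster
location ~* \.(css|js|png|jpg|webp|svg)$ {
    expires 30d;
    add_header Cache-Control "public";
}
```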
2. It’s crawlable for search engines
Search engines use robots to crawl or spider your website. The robots follow links to discover content on your site. A great internal linking structure will make sure that they’ll understand what the most important content on your site is.
But there are more ways to guide robots. You can, for instance, block them from crawling certain content if you don’t want them to go there. You can also let them crawl a page, but tell them not to show this page in the search results or not to follow the links on that page.
Robots.txt file
You can give robots directions on your site by using the robots.txt file. It’s a powerful tool, which should be handled carefully: one small mistake can keep robots from crawling important parts of your site. Also, be careful not to block your site’s CSS and JavaScript files in robots.txt. These files contain code that tells browsers what your site should look like and how it works; if they’re blocked, search engines can’t find out if your site works properly.
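For example, a minimal robots.txt might look like this (the blocked path and domain are hypothetical):

```text
User-agent: *
Disallow: /cart/

Sitemap: https://www.example.com/sitemap.xml
```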
The meta robots tag
The robots meta tag is a piece of code in the so-called head section of a page’s source code. Robots read this section when they find a page; it tells them what they’ll find on the page and what they need to do with it.
If you want search engine robots to crawl a page, but to keep it out of the search results for some reason, you can tell them with the robots meta tag. With the robots meta tag, you can also instruct them to crawl a page, but not to follow the links on the page.
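Both instructions use standard values of the tag’s content attribute; for example, placed in the head section of a page:

```html
<!-- Crawl the page, but keep it out of the search results -->
<meta name="robots" content="noindex, follow">

<!-- Index the page, but don't follow the links on it -->
<meta name="robots" content="index, nofollow">
```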
3. It doesn’t have (many) dead links
If a link leads to a non-existing page on your site, people will encounter a 404 error page. There goes your carefully crafted user experience!
Search engines don’t like to find these error pages either. And, they tend to find even more dead links than visitors encounter because they follow every link they bump into, even if it’s hidden.
To prevent unnecessary dead links, you should always redirect the URL of a page when you delete it or move it. Ideally, you’d redirect it to a page that replaces the old page.
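Finding those dead internal links can be automated. A minimal sketch in Python (the page markup and the set of existing URLs are made up for illustration) extracts every link the way a crawler would, then flags the ones that no longer resolve to a known page:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect every href found in an HTML document, the way a crawler would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def find_dead_links(html, existing_pages):
    """Return links that point at pages which no longer exist."""
    parser = LinkExtractor()
    parser.feed(html)
    return [link for link in parser.links if link not in existing_pages]

# A toy page with one valid and one dead internal link
page = '<p><a href="/about/">About</a> <a href="/old-page/">Old</a></p>'
print(find_dead_links(page, {"/", "/about/", "/contact/"}))  # ['/old-page/']
```

A real audit would fetch pages over HTTP and check status codes, but the principle is the same: collect every link, then verify each target still exists.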
4. It doesn’t confuse search engines with duplicate content
If you have the same content on multiple pages of your site – or even on other sites – search engines might get confused. Because, if these pages show the same content, which one should they rank highest? As a result, they might rank all pages with the same content lower.
Unfortunately, you might have a duplicate content issue without even knowing it. For technical reasons, different URLs can show the same content. For a visitor, this doesn’t make any difference, but for a search engine it does; it’ll see the same content on a different URL.
Luckily, there’s a technical solution to this issue: with the so-called canonical link element, you can indicate what the original page is, or the page you’d like to rank in the search engines.
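In practice, the head section of each duplicate URL points at the preferred version (the URL here is hypothetical):

```html
<link rel="canonical" href="https://www.example.com/original-page/">
```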
5. It’s secure
Making your website safe for users and guaranteeing their privacy are basic requirements nowadays. One of the most crucial steps is implementing HTTPS.
HTTPS makes sure that no one can intercept the data sent between the browser and the site. So, for instance, if people log in to your site, their credentials are safe. You’ll need a so-called SSL certificate to implement HTTPS on your site. Google acknowledges the importance of security and therefore made HTTPS a ranking signal: secure websites rank higher than their unsafe equivalents.
You can easily check if your website uses HTTPS in most browsers. On the left-hand side of your browser’s address bar, you’ll see a lock if the connection is secure. If you see the words “not secure”, you (or your developer) have some work to do!
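Part of a proper HTTPS setup is sending visitors who arrive over plain HTTP to the secure version. A sketch in nginx (the domain is illustrative) looks like this:

```nginx
# Redirect all plain-HTTP traffic to the HTTPS version of the site
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://$host$request_uri;
}
```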
6. Plus: it has structured data
Structured data helps search engines understand your website, content or even your business better. With structured data you can tell search engines what kind of product you sell or which recipes you have on your site. Plus, it gives you the opportunity to provide all kinds of details about those products or recipes.
It also makes your content eligible for rich results; those shiny results with stars or details that stand out in the search results.
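Structured data is usually added as a JSON-LD script in the page, using the Schema.org vocabulary. A sketch for a product page (the product name and price are made up) might look like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "description": "A sample product, used here only for illustration.",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD"
  }
}
</script>
```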
7. Plus: It has an XML sitemap
Simply put, an XML sitemap is a list of all pages of your site. It serves as a roadmap for search engines on your site. Ideally, a website doesn’t need an XML sitemap: if it has an internal linking structure that connects all content nicely, robots won’t need it. However, not all sites have a perfect structure, and an XML sitemap won’t do any harm, so it’s good practice to have one.
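For reference, a minimal XML sitemap follows the sitemaps.org protocol; the URLs and dates below are illustrative:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2022-02-11</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
    <lastmod>2022-01-15</lastmod>
  </url>
</urlset>
```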
8. Plus: International websites use hreflang
If your site targets more than one country, or countries where the same language is spoken, search engines need a little help to understand which countries or languages you’re trying to reach. If you help them, they can show people the right website for their area in the search results.
Hreflang tags help you do just that. You can define for a page which country and language it is meant for. This also solves a possible duplicate content problem: even if your US and UK site show the same content, Google will know it’s written for a different region.
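In practice, each regional version of a page lists all its alternatives in its head section (the URLs are hypothetical):

```html
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/">
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/">
<link rel="alternate" hreflang="x-default" href="https://www.example.com/">
```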