Don’t worry if you’re having trouble understanding our SEO beginner’s tutorial — SEO jargon can be as difficult to grasp as a new language. This SEO glossary is structured by chapter and includes definitions and links to further reading. It may be worth bookmarking this page.

Measure, prioritize, and implement SEO

An API allows developers to access features or data from another service, such as an operating system or app.

Bounce rate is the percentage of site visitors who didn’t take action on the website. A bounced session occurs when someone visits your home page and then immediately leaves.

Channels include organic search and social networking.

CTR (click-through rate) is the ratio of clicks to impressions — how often people who see your URL in search results actually click it.

Conversion rate is a visits-to-conversions ratio. How many website visitors fill out forms, call, subscribe, etc.?

A qualified lead is a phone call or form submission from a website visitor. “Qualified” leads are likely to become clients.

Goals: In Google Analytics, goals let you measure your conversion rate — define a goal (such as a form submission) and Google Analytics tracks how often visitors complete it.

Google Tag Manager organizes website tracking codes.

Googlebot / Bingbot: Google and Bing’s web crawlers.

Pages per session, or “page depth,” is the average number of pages customers read in a single session.

Page speed: Metrics such as first contentful paint, first meaningful paint, and time to interactive measure how quickly a page loads and becomes usable.

In SEO, pruning is removing low-quality pages to enhance site quality.

Scroll depth tracks how far down visitors scroll.

Scrum board: A way to monitor tasks for a larger aim.

Search traffic refers to website visits from search engines like Google.

Time on page: How long a visitor stays on your page before leaving. Bounced sessions record a time on page of zero, because Google Analytics calculates time from the timestamps of subsequent clicks.

UTM code: Add a UTM code at the end of your URL to track click source, medium, and campaign.
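As an illustration (the page and campaign names here are hypothetical), a URL tagged with UTM parameters might look like this:

```text
https://www.example.com/landing-page?utm_source=facebook&utm_medium=social&utm_campaign=spring_sale
```

Here utm_source records where the click came from, utm_medium records the channel, and utm_campaign records the campaign name — all three show up in your analytics reports.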

Links and authority

Rand Fishkin coined the phrase “10x content” to denote web content that is ten times better than anything else ranking for the same topic.

“Amplification” refers to extending your brand’s message through social media, sponsored commercials, and influencer marketing.

DA: Domain Authority (DA) forecasts a domain’s ranking. It’s ideal as a comparison statistic (comparing a website’s DA to its competitors’).

Deindexed: When a search engine removes a URL, a set of URLs, or an entire domain from its index. This might happen when a website is penalized for violating Google’s quality guidelines.

In local SEO, a “directory” is a list of local companies that contains the NAP and occasionally the website. Low-quality directory or bookmark links breach Google’s standards.

Editorial links are gained naturally and supplied by an author’s choice, not because they were paid or compelled.

“Follow” links convey PageRank.

Google Analytics is a free tool website owners may use to track user behavior (they can pay to get more features). Google Analytics can show you where your visitors are coming from and how often they achieve goals (like filling out a form).

Google search operators are text you may add to your query to narrow your results. By adding “site:” to a domain name, you may see many of its indexed pages.
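For example, combining the site: and intitle: operators (the domain here is just a placeholder) restricts results to one domain and to titles containing a word:

```text
site:example.com intitle:"seo"
```

This returns indexed pages from example.com whose titles contain “seo.”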

Guest blogging entails presenting an article (or an idea for an article) to a publication in the hopes that they’ll publish it and link back to your site. Beware. Google’s quality criteria prohibit keyword-stuffed anchor text in large-scale guest blogging efforts.

Link building: The word “building” makes it sound like you create the links yourself, but it actually means getting other sites to link to yours so search engines see your site as more important.

Link exchange: An arrangement in which sites agree to link to each other. Excessive link exchanges violate Google’s guidelines.

Moz’s Link Explorer finds and analyzes links.

A link profile is all the links to a domain, subdomain, or URL.

Unstructured citation: Full or partial company contact details listed on a non-directory platform (like online news, blogs, or best-of lists).

MozBar is a Chrome add-on that displays a page’s DA, PA, title tag, and more.

Nofollow links don’t convey PageRank. Google requires the nofollow attribute on paid links.

PA: Like DA, PA predicts a page’s search ranking.

Paid links: Links purchased with money or another good. A paid link is an ad and should include a nofollow tag to prevent PageRank.

“Qualified” traffic suggests the visitor is likely to find the page’s content beneficial and make a purchase.

Referral traffic comes from links on other sites. Google Analytics lists traffic from Facebook as “facebook.com / referral” in the Source/Medium report.

Resource pages curate lists of helpful links to other sites. If your firm provides email marketing software, you could search for online marketing resource pages and ask their owners to link to your site.

Reputation: How your brand is perceived.

Spam Score is a Moz metric that uses flags correlated with penalized sites to estimate a domain’s risk of being penalized.

Unnatural links are “links on a page that were not put there by the site owner or vouched for by them,” according to Google. This violates their guidelines and might lead to a website penalty.

Search engines crawl, index, and rank.

Traditional SERPs contained ten organic results, each formatted the same way (the classic “10 blue links”).

Black hat SEO techniques violate Google’s quality guidelines.

Search engines locate your pages through crawling.

“De-index” means to remove a page from Google’s index.

Featured snippets are organic results at the top of specific SERPs.

Google My Business lists local businesses for free.

Image carousels: Scrollable galleries of images that appear on some SERPs.

Indexing organizes crawled data.

“Intent” describes what a searcher really wants to accomplish with a query.

KPIs (key performance indicators) are measurable values that show how well an activity is performing.

Search engines return a “local pack” of three suitable local companies for “oil change near me” requests.

“Organic” search results don’t pay for placement.

Some search engine results pages offer a “People Also Ask” section with FAQs and answers.

Query: The words typed into the search box.

Relevance rating sorts search results by how well they answer a question.

Search engine: A program that searches its database for information matching a user’s query — for example, Google, Bing, or Yahoo!

SERP features: Non-standard results that appear on search engine results pages.

SERP is a search engine’s results page.

Traffic is website visits.

URLs are digital addresses of web pages and other resources.

Google and Bing give “webmaster guidelines” to help webmasters create readily discoverable, indexed, and ranked content.

“White hat” SEO tactics don’t break Google’s regulations.

2xx status codes indicate a successful page request.

4xx status codes: Client-error responses to a page request (for example, 404 “not found”).

5xx status codes: Server cannot complete request.
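A few common codes from each class (these are standard HTTP status codes):

```text
200 OK                   - 2xx: the page was returned successfully
301 Moved Permanently    - 3xx: the page has been permanently redirected
404 Not Found            - 4xx: the requested page doesn't exist
503 Service Unavailable  - 5xx: the server can't respond right now
```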

Advanced search operators are special characters and commands used to improve a search.

Algorithms: Processes or formulas by which stored information is retrieved and ordered in meaningful ways.

Backlinks are links from other websites to yours.

“Crawlers” or “spiders” search the Internet for content.

Cached: A saved copy of a web page.

Google’s web indexer is Caffeine. While Googlebot actively crawls the web to find content, Caffeine indexes it.

Citations are web-based references to a business’s name, address, and phone number (NAP).

Cloaking involves showing search engines and humans alternate content.

Crawl budget: The average number of pages a search engine bot will crawl on your site.

Crawler directives tell the crawler what to crawl and index.

Distance: A local ranking factor based on proximity — how close a business is to the searcher’s location or to the location named in the query.

Engagement: How searchers interact with your site after arriving from the search results.

Google Quality Guidelines: Published guidelines from Google detailing tactics that are forbidden because they harm searchers or manipulate search rankings.

Google Search Console lets site owners monitor their site’s search performance.

HTML (hypertext markup language) is the language used to create web pages.

Google Search Console’s Index Coverage report shows the indexing status of your site’s pages.

Index: A huge database of everything search engine spiders find worthy of presenting to searchers.

Internal links lead to other pages on the same domain.

JavaScript adds dynamic functionality to static websites.

Login forms: Pages that require a login before their content can be viewed.

Manual penalty: A “Manual Action” by Google in which a human reviewer determines that certain pages on your site break Google’s quality guidelines.

Meta robots tags inform crawlers how to index a website’s content.

Navigation: Links that let people browse between pages. They usually show in a list at the top of your website, in a side column, or at the bottom (“footer navigation”).

NoIndex tag tells search engines not to index a page.
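In HTML, the meta robots and noindex directives described above are placed in a page’s head section — for example:

```html
<!-- Tell all crawlers not to index this page and not to follow its links -->
<meta name="robots" content="noindex, nofollow">

<!-- Allow indexing, but don't pass link equity through this page's links -->
<meta name="robots" content="index, nofollow">
```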

PageRank is a core component of Google’s algorithm: a link analysis program that estimates a page’s importance by evaluating the quantity and quality of links pointing to it.

Personalization alters a user’s search results depending on location and search history.

Prominence: A local pack ranking factor reflecting how well-known and popular a business is.

RankBrain is the machine learning component of Google’s algorithm that adjusts rankings by promoting the most relevant and helpful results.

Relevance refers to how well a local business matches the searcher’s goal.

Robots.txt tells search engines which parts of your website to crawl.
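A minimal robots.txt (the paths here are hypothetical) lives at the root of your domain and might look like this:

```text
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

The User-agent line says which crawlers the rules apply to (* means all), and Disallow blocks a section of the site from being crawled.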

Search forms: Website features that help users find pages.

Search Quality Rater Guidelines: Guidelines for the human raters who evaluate web page quality for Google.

A sitemap is a list of URLs that crawlers use to index your site’s content.

Spammy tactics violate search engine quality regulations, like “black hat” approaches.

URL folders are website sections separated by slashes (“/”). In moz.com/blog, “blog” is a folder.

URL parameters are added after a question mark to change page content (active parameter) or track information (passive parameter).
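In this hypothetical store URL, color=red is an active parameter that changes what the page shows, while utm_source=newsletter is a passive tracking parameter:

```text
https://www.example.com/shoes?color=red&utm_source=newsletter
```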

X-robots-tag directs crawlers how to index web page content.
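Unlike the meta robots tag, the x-robots-tag is sent as an HTTP response header, so it can also apply to non-HTML files like PDFs — for example:

```text
HTTP/1.1 200 OK
Content-Type: application/pdf
X-Robots-Tag: noindex, nofollow
```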

Search Keywords

Ambiguous intent: A search term whose goal is unclear and requires more context.

Commercial investigation queries compare products to find the best one.

Informational queries seek answers to questions.

Moz’s Keyword Difficulty score evaluates how hard it is to rank higher than competitors.

Moz’s Keyword Explorer helps you research keywords.

Local queries include “coffee shops around me” and “gyms in Brooklyn.”

Long-tail keywords are longer queries, typically containing three or more words. Because of their length, long-tail queries are more specific.

Navigational queries are searches for a specific site or page — for example, searching “Moz blog” to reach the Moz blog.

Regional keywords are used only in a specific region. Google Trends can tell you whether “pop” or “soda” is more popular in Kansas.

Search volume: The number of times a keyword is searched. Keyword research tools estimate monthly search volume.

Seasonality: Keyword popularity fluctuates with the seasons — “Halloween costumes” spikes in the week before Halloween.

Seed keywords describe your product or service.

Transactional queries are searches made with intent to buy. In the classic funnel ordering of query types, transactional queries come last.

Website SEO

Alt text: Text in HTML that describes an image, used by screen readers and search engines.
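In HTML, alt text is the alt attribute on an image tag (the file name and description here are made up):

```html
<img src="rooster-logo.png" alt="Red rooster logo on a white background">
```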

Anchor text: The clickable text of a link.

Auto-generated content: Content created programmatically rather than written by humans.

Duplicate content: Content shared between domains or between multiple pages of a single domain.

Geographic modifiers: Terms that describe a location or service area. In “Seattle pizza,” “Seattle” is a geographic modifier.

Header tags: HTML elements (h1 through h6) that identify headings on a page.

Image compression: Reducing image file sizes to speed up pages without noticeably compromising quality.

Image sitemap: A sitemap containing only image URLs.

Keyword stuffing: Overusing keywords in content, a spammy tactic.

Link accessibility: How easily visitors or crawlers can find a link.

Link equity is a link’s worth or authority.

Link volume: The number of links on a page.

Structured data markup helps search engines read a website’s data.

Meta descriptions: HTML elements that describe a page’s content. Google occasionally uses them as the snippet in search results.

Panda: A Google algorithm update that penalized low-quality content.

Protocol: The “http” or “https” preceding your domain name. It governs how data is transferred between the server and the browser.

Redirection: When a URL moves to a new location. Permanent (301) redirects are the most common.

Rel=canonical informs Google which page is original.
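The canonical tag goes in the head of the duplicate page and points at the original (the URL here is a placeholder):

```html
<link rel="canonical" href="https://www.example.com/original-page/">
```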

Scraped content: Content copied from websites without permission.

SSL certificates encrypt web server-browser data.

Thin content: Low-value content.

Thumbnails are little images.

Optimization Techniques

Accelerated mobile pages, or AMP, are designed to make mobile viewing as fast as possible.

Async stands for “asynchronous” and refers to a web browser’s ability to skip waiting for one operation to finish before moving on to the next.

A web browser, like Chrome or Firefox, lets users access web-based information. When you enter an address (for example, “google.com”), you tell the browser to retrieve the resources needed to render that page.

“Bundling” combines many resources into one.

ccTLD stands for country code top-level domain. “.ru” is Russia’s ccTLD.

Client-side rendering and server-side rendering describe where a page’s files are executed. With client-side rendering, the client’s browser runs the files on the user’s machine; with server-side rendering, the server renders the files and sends the finished page to the browser.

The critical rendering path is the sequence of steps a web browser takes to transform HTML, CSS, and JavaScript into a visible web page.

CSS determines how a website looks to users (ex: fonts and colors).

Domain Name Servers (DNS) connect domain names (like “moz.com”) to IP addresses (such as “127.0.0.1”). DNS translates domain names into IP addresses, allowing web browsers to load website resources.

DOM is the HTML document’s structure. This model explains how JavaScript may read and modify a document.

Domain name registrars: Companies that monitor and handle domain name reservations (for example, GoDaddy).

Faceted navigation lets users sort and filter hundreds or millions of URLs to find exactly what they want — for example, sorting products by price from low to high, or applying a filter to show only small sizes. E-commerce websites commonly employ faceted navigation.

The Google Search Console’s Fetch and Render function lets users see a web page as Google does.

File compression: Encoding data with fewer bits to reduce file size. Compression techniques vary, but all shrink files for faster transfer.

Hreflang informs Google the language of the content. This allows Google to give people the version of your website in their language.
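Hreflang annotations are typically added as link tags in the page head. This hypothetical example points Google to English and Spanish versions of the same page:

```html
<link rel="alternate" hreflang="en" href="https://www.example.com/page/">
<link rel="alternate" hreflang="es" href="https://www.example.com/es/page/">
```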

IP address: A unique string of numbers (an Internet Protocol address) assigned to each website. The internet locates websites by IP address, not by domain names like “moz.com”; DNS maps each domain name to its IP address.

JSON-LD is a format for structuring data. Google recommends using JSON-LD rather than other formats to implement schema.org markup.
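A small JSON-LD block describing a hypothetical local business might look like this (LocalBusiness, name, telephone, and PostalAddress are real schema.org types and properties; the business details are invented):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Coffee Shop",
  "telephone": "+1-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Seattle",
    "addressRegion": "WA"
  }
}
</script>
```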

“Lazy loading” delays loading an object until it’s needed to boost page performance.

“Minification” is removing as many unnecessary characters from source code as possible without harming its function. Unlike compression, minification actually deletes characters from the code.

Google began switching websites to mobile-first indexing in 2018: Google crawls and indexes your pages based on how they appear on mobile devices rather than desktops.

Using pagination, a website owner can divide a page into multiple ordered parts, like the pages of a book — useful for content-heavy sites. Paginated pages use rel=”next” and rel=”prev” tags to indicate page order; these tags tell Google that the pages should share consolidated link properties and that searchers should be sent to the first page.
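On page 2 of a hypothetical three-page article series, the tags would be:

```html
<link rel="prev" href="https://www.example.com/articles?page=1">
<link rel="next" href="https://www.example.com/articles?page=3">
```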

Programming is writing instructions a computer can execute. JavaScript, for example, can add dynamic rather than static features to a web page.

Rendering: The process by which a web browser transforms a website’s code into a viewable page.

Render-blocking scripts delay your browser from displaying a page: they must be fetched and executed before the page can render, sometimes forcing the browser to make extra round trips to the server.

Google encourages responsive design for mobile-friendly websites. This design style lets a website adapt to any device.

Rich snippets: Google and other search engines show snippets — short previews of a URL — on their results pages. “Rich” snippets are enhanced versions of ordinary snippets, and structured data markup can encourage them. Review markup, for example, appears as stars next to select URLs in search results.

Schema.org is a markup vocabulary that “wraps around” your content to give search engines extra information about it. Schema.org data is “structured,” meaning it is organized, in contrast to “unstructured” data.

SRCSET works like responsive design for images, displaying the appropriate version based on context.
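A srcset example (the file names are hypothetical): the browser chooses the best image for the current viewport, so small screens don’t download the large file.

```html
<img src="photo-small.jpg"
     srcset="photo-small.jpg 480w, photo-large.jpg 1080w"
     sizes="(max-width: 600px) 480px, 1080px"
     alt="Example photo">
```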

“Structured data” is organized information (as opposed to unorganized). Schema.org helps you structure your data by labeling it with search-engine-friendly metadata.
