Day 8 - On-Page SEO

Ryuzaki

[Image: 1header.png]


Today we are going to cover the entirety of on-page SEO. This is possibly the most important lesson in this entire guide. Do this right and you'll save hundreds of thousands of dollars in time, energy, and costs related to link and social signal acquisition.

Introduction

What They Say
When you're out and about on all of the blogs and forums, you'll hear a lot of conflicting advice about on-page factors...

[Image: 2blah.gif]

The worst advice you'll hear is that it doesn't matter at all. We've known "SEOs" who think Google is done with on-page and it's all about links, but who then can't figure out why a particular page still isn't ranking after they've spent $10,000 on links to it.

The second worst advice is from old blogs or old internet marketers who haven't kept up with the playing field. They'll tell you to take every possible old-school factor and include it three times over. The more the merrier! The problem with that is our friend Panda and his ability to find over-optimized content.

Here's what you should remember... On-Page SEO is all about walking the razor's edge. If done right, you've already won over half the battle of ranking for a specific set of terms. If under- or over-optimized, you're not even in the race.

What We Say
...

[Image: 3hulk.gif]

The old stuff matters a lot.

These factors are the core of on-page SEO and exactly what's targeted as being over-optimized because any brainless monkey can do it. It's what will help you rank for the specific keyword you've chosen. The new stuff is what's going to support that ranking and bring in long-tail traffic that adds up to be more than the specific keyword you're going after.

These newer factors won't over-optimize you because they aren't based around a keyword. They are based around general factors that send "quality" signals to the search spiders. They are also your competitive edge. They can't be faked. There are no tricks or shortcuts. They take time and they take mental energy to compose properly.

So here's how we'll fast-track you on the topic of on-page SEO:
  • Explain the old factors and how to use them in the present without over-doing it.
  • Introduce the new factors and describe their use and purpose.
Let's get started.



[Image: 4factors.png]

The Old Factors

Let me state right off the bat... these are old but still very relevant. The key is to not go overboard. Which is to say, "No. You do not need to include all of these, and definitely not more than once."

This entire topic assumes you understand HTML, considering you're working online and that's the backbone to everything that's displayed in your browser. You have to be able to "mark up" your content with HTML tags in order to send these on-page signals. It only requires a rudimentary understanding of HTML, but if you think you're going to become a top dog in this game without knowing it like the back of your hand, you're wrong. Make sure you know what's what here!

The Good
This list is in order of importance and strength:
  • The Title Tag - <title>
  • Header Tags - <h1>, <h2>, <h3>, etc.
  • Keyword in URL
  • Image Alt Tags - <img alt="" />
  • Strong and Emphasis Tags - <b> or <strong> and <i> or <em>
  • The Meta Description - <meta name="description" content="">
Those are the ones that matter, listed in quickly decreasing order of value. If you want to play it safe, don't even worry about strong or emphasis tags; just use them naturally. And you can skip the meta description if you'd like. It takes time, should be unique, and Google overrides it 90% of the time anyway.

That makes your life easy! Believe it or not, only a handful of the old on-page SEO factors even matter these days. But they matter a lot. We'll explain how to use them in a moment.

The Bad
Here are two that are absolutely, without a doubt, no longer relevant that you can ignore entirely:
  • Keyword Density
  • The Meta Keywords Tag - <meta name="keywords" content="">
Keyword density is out thanks to Latent Semantic Indexing (LSI). If you ever consider keyword stuffing anywhere, you might as well give up now. As far as density goes, just be natural and you'll be fine. Do not write the body of your content for robots, write for humans. The robots know the difference.

The Meta Keywords tag was from a time when search engines needed us to explicitly tell them which keywords we thought were important because they couldn't sort it out themselves. Now they can, and they know that anyone still filling the tag out is getting greedy and outing themselves as an SEO. It's basically like announcing to Google, "Hey guys, here's exactly what you shouldn't allow me to rank for."

[Image: 5target.png]

You want to make sure you never rank for your keywords? Use a 10%+ density and make sure you put it in the meta keywords tag. There's no quicker way to convince Google that you're an SEO spammer.

How to Use the Old Factors
We've talked about keyword research. You understand that while you have a core short-tail keyword you're going for, you also have a handful of long-tail keywords that you'll take in the meantime and that will support your ability to conquer the main keyword.

Important: If you use your main keyword in every single possible spot in the old factors, you're on your way to a sad fizzling death. On-page over-optimization ties in closely with off-page over-optimization. The more you do here, the less you can do there, and vice versa. So don't get greedy!
Let's look at each of the remaining old factors that are worth your while and build some examples along the way.

Let's say you're optimizing for three terms (in order of importance), using the classic example of the widget:
  • Best Widgets (30,000 searches, buyer's intent!)
  • Blue Widgets (10,000 searches, informational intent)
  • Blue Widgets Reviews (3,000 searches, buyer's intent)
Your goal is to take those who know they want a widget, or even better know which color they want, and convince them that they specifically want one of the ones you've ranked as best so they purchase and you get a commission. The most valuable term to rank for is "Best Widgets". While "Widgets" has a ton of volume, you're not likely to take it down with this article, but any extra traffic you do get from it has the potential to be a lot and the potential to be converted.

Title Tag
Your main goal here is to include your short-tail keyword in your title tag. Some people claim that you get a boost if it's at the front of the tag. We say that you can't possibly write a good title if you're doing that all of the time. User interaction (click-through rates on the SERPs) matters as well, so don't sabotage yourself by being greedy in any one place.

[Image: 6non.png]

You should be able to include your short-tail keyword in your title, which is non-negotiable, while also sliding in a long-tail and some other variations. We'd shoot for something like this...

The Best Widgets are Blue - The Top 5 Reviewed!

What we've done is ensure "Best Widgets" (our money maker) appears intact. Make sure this happens. We've mentioned the word "Blue" and also "Reviewed" as a variation of the word Reviews. All three of your terms are in the most important slot in some fashion without you mentioning the word "Widgets" more than once.

Sure, your two supporting keywords aren't explicitly mentioned here. We'll take care of that elsewhere, which will help dilute your use of the main term while helping the search engines understand the context of the article. They aren't dumb and will totally understand what you should rank for. They are hoping you're dumb enough to cram it down their throat so they can penalize you. Follow this method and you're safe.

Header Tags
Generally we will make our H1 tag the same as our title tag. This is for the user. When you display the title tag to them in the SERPs, you create continuity when they land on your page and expect to see that same title displayed somewhere. However, if you want to get tricky, you can write a different version of the title in order to use different keywords. We don't recommend it, but we don't see it as bad. The user always comes first.

At this point in the game, because most of us use the Title inside the H1 tag, the H2 may be the most powerful header tag you have at your disposal.

[Image: 7gun.png]

This is your opportunity to help ensure your rankings for your second most important term. In this case, the 2nd and 3rd most important are a shorter and longer version of each other, so you can use both. Because they are less competitive in the SERPs, they need less emphasis in the on-page SEO and any slack will be made up by the LSI work being done and the links you'll obtain in order to rank for the main term. You'll take down all three terms with one article!

After your title and your introduction, your H2 tag might look something like...

What to Look for in Blue Widgets Reviews
Before you embark on the journey of sharing the top reviewed products, you can explain the structure of the reviews, which will allow you to expand your content and include LSI terms.

At this point, in our H3 tag, we could keep it simple and say: The Top 5 Reviewed Widgets. And then you could use H4's for the name of each widget product. This will dilute your usage of your specific key phrases and keywords within your header tags, while keeping them in the most important tags. Safe. And it digs you down to H4, which most articles online don't do. This means you have more depth and information, as far as a computer might be concerned.

Keyword in URL
When possible, include your keyword or keyphrase in your URL structure somehow. There are lots of ways to accomplish this based on the CMS you're using. If you're building a flat file site then you can set up whatever you want. Get it in there!

Image Alt Tags
When you place a picture in your article, you have to use the image tag. That's what's happening even if your visual editor is doing it for you. Some people are blind and use a text-to-speech converter. Sometimes the image fails to load. In both circumstances, the "alternate" option is to show or read the contents of the alt tag.

The purpose of this is to describe the picture in some sense. It's not exactly a caption, but a short phrase that gives the context of what is in the image. So perhaps you show the Acme Model 3 Blue Widget in an image. That'd be the perfect alt tag, considering it uses two of our words that help make up our main keywords. Remember, you don't have to fall prey to slamming in the exact keyword phrase you're hoping to rank for. Provide context and be natural:

[Image: 8alt.jpg]

You don't want to manipulate here. Google's machine learning is growing in its ability to visually understand the context of a picture beyond just the alt tags and titles of the image files. Safeguard yourself against future penalties. If you want to squeeze in your keywords, get creative. Create header images to go along with your header tags and include your keywords there.

Strong and Emphasis Tags
Don't be the person who straight up just bolds blue widget reviews for no reason other than to do it. You're begging Google to notice your tricks and doing it in the places that offer the least reward. Don't even bother with bolding and italics unless it's natural. Then creatively sneak in your phrase. Never bold or italicize your phrase only, but include it within the emphasized portion. For instance...

Remember as you read these reviews... another person's opinion about a specific set of blue widgets is just that, their opinion. Your mileage may vary. Make sure to consider which qualities are the most important to you before you make your decision.
If you're going to use these tags, just keep it natural. Technically, while they offer little power, they could be the tie-breaker between you and your competitor. But that's never really going to happen due to the hundreds of other ranking signals on and off-page. It's one of those easy wins that could come back to bite you if you don't keep it natural.

Meta Description
At this point, the meta description is only a suggestion to Google about what they should display in the SERPs under your title. They have a lot of data and more often than not will pluck portions out of your content to display. They do this in order to show the keyword or phrase that the user has searched. So that lets you know that having your keyword in your description matters, if not as a ranking signal then at least as a means to increase your click-through rate, which is a positive ranking signal. So do it if you have time.

[Image: 9buso.png]

It should be unique and 150 to 160 characters. This number fluctuates as Google tests different lengths, but aim for that and it will be good enough for you to not have to revisit it. Your main goal here is to include one of your keywords while enticing your potential visitor to click your link in the SERPs instead of someone else's.

So think of it like Clickbait. Don't give away the full story. Make them want the rest. They'll know the only way to get it is to click.
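
If you want a quick sanity check on that 150-160 character target while you write, a few lines of Python will do it. The example description below is made up for the widget article, not a recommendation:

```python
# Quick check against the 150-160 character guideline mentioned above.
description = ("We tested this year's best widgets and the blue models came out on top. "
               "See which five earned a spot in our reviews and which one we'd buy twice.")

length = len(description)
print(f"{length} characters")
if 150 <= length <= 160:
    print("In the sweet spot.")
elif length < 150:
    print("Room for a little more enticement.")
else:
    print("Likely to get truncated in the SERPs.")
```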


Take a Breather...

With just the above information, you'll spank most of your nonsense competitors. I'd venture to say that this would take you 30% of the way alone, without any links or other on-page factors. But thankfully there's still plenty of advanced on-page optimization we can do that is guaranteed to boost our rankings and present no danger of being squashed by the Panda.

[Image: 10panda.gif]

As you wrapped up the previous section, you may have been thinking... "There sure is a lot of emphasis here on playing it safe. I'm ready to go hard and make this money, not pussy-foot around!" You're right, but not for the reasons you're thinking. If your intention is to build a long-term asset, then definitely play it safe above. Below is where you get to go hard.

The New Factors

The old factors are entirely about optimizing around a set of key phrases. The new factors are about optimizing around the concepts of quality, authority, and trust. Yes, they are abstract. Yes, nobody can measure them and most struggle to define them. That's because the game is now being played on the next level and the average advice-slinger isn't there yet.

The old factors can be considered global in the sense that they apply in every niche for every page. They can be considered hierarchical in the sense that all you have to do is beat the person directly below you and attempt to beat the person above you. It was a game of leap-frog.

The new factors are where trust, authority, and quality come in, and the reason everyone struggles to understand them is that they are still thinking about leap-frog. They are thinking in terms of variables in isolation on one SERP. That's simply incorrect these days.

These abstract variables are not global, but local. They refer not only to your specific page and those other 9 on page one of that specific SERP, but with your entire domain being measured against those of your niche competitors. The way to reduce spammer effectiveness was to stop measuring individual pages so heavily and to begin to weigh factors that can't be easily manipulated.

Think of things such as the age of the domain, the number of indexed pages, emphasis on these same factors on the domain where your backlink is coming from, topical relation of the page a link is coming from, etc. These are much harder to fake, but still possible.

But the one set of factors that are impossible to fake are what we are about to discuss. You can easily take advantage of them if you're willing to put in the time and energy up front. Yes, it requires attention to detail and all of the other draining mental activities. But they also make your overall job 100% easier in the long-run. You plan on being around for the long-run, right?

[Image: 11lab.png]

Here are the new on-page SEO factors that you should be including:
  • Your Grade Level Reading Score
  • Latent Semantic Analysis/Indexing (LSA/LSI)
  • Term frequency - Inverse document frequency (tf*IDF)
  • Supplementary Content (sidebar, footer, header)
  • Page Speed Time
  • Length of Content
  • Enhanced Content Types
Let's break these down individually and talk about how to apply them.

Grade Level Reading Score
There are several algorithms being used for this, all offering similar results. The Flesch Reading Ease score assigns your text a number from roughly 0 to 100, where higher means easier reading, while grade-level scores like Flesch-Kincaid assign a value ranging from kindergarten up to post-doctorate complexity.

[Image: 12kid.jpg]

Your task is to understand where the top 10 pages ranking for your terms in your niche are falling on these scores. You can literally test them and find out, or you can just read them and get a feel for their writing style. The factors that affect this score are the average number of words per sentence, number of sentences per paragraph, number of letters per word, amount of punctuation being used, grammar, syntax, rarity of synonyms, etc. You get the point.

If you're writing about coloring books, there's zero reason it should measure at the Master's Degree level. And your discussion on the nature of consciousness and its role in the human condition shouldn't be at a 3rd grade level either.

This is all fairly obvious once it's pointed out. The number of errors in your content will also affect this score and your quality, trust, and authority as well. Find out what Google prefers for your vertical and go with it. This is going to be based on what the readership prefers, so it's a win-win for you.
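
If you'd rather score pages yourself than eyeball them, here's a minimal Python sketch of the standard Flesch-Kincaid Grade Level formula. The syllable counter is a crude vowel-group heuristic, so treat the output as a ballpark figure rather than an exact grade:

```python
import re

def count_syllables(word):
    # Crude heuristic: count runs of vowels, with a minimum of one per word.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text):
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    words_per_sentence = len(words) / max(1, len(sentences))
    syllables_per_word = syllables / max(1, len(words))
    # Standard Flesch-Kincaid Grade Level formula.
    return 0.39 * words_per_sentence + 11.8 * syllables_per_word - 15.59

sample = ("Blue widgets are simple. You fill them, you spin them, you enjoy them. "
          "Our reviewers tested dozens of models to find the best widgets for most people.")
print(round(flesch_kincaid_grade(sample), 1))
```

Run that across the top 10 for your term and across your own draft, and you'll see quickly whether you're writing over or under the heads of the audience Google is already rewarding.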

Latent Semantic Analysis
This is why you don't need to do keyword stuffing or worry about keyword density any more. We started with dictionaries, and then created a relationship between dictionary words with thesauruses. The next step has emerged, which goes beyond the word or even sentence level. It deals with overall pieces of content.

The smart machine will have noticed, after analyzing its incredibly large index, "I notice that 30% of the time people talk about Widgets, they end up mentioning Gadgets too. There's a relationship there. But it's not as strong as the relationship to the concept of Thingamabobs, which gets mentioned 80% of the time Widgets are mentioned."

[Image: 13ai.png]

Your task is to determine what phrases are tangentially related to your main key phrases and keywords but are not your keywords. You could bot this, which has been done with very interesting results. However, the best way to achieve this is to actually be involved in your niche or hire someone who is, and it will all happen naturally. You could get super scientific about it, but I think you're better off getting most of the way there and then starting on your next piece of content. If you're even thinking about LSA, you're going to beat 95% of your competitors in terms of on-page SEO.
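
To make the idea concrete, here's a toy Python sketch of that kind of co-occurrence check. The mini "corpus" is obviously made up; in practice you'd feed it the visible text of pages from your niche:

```python
# Toy corpus standing in for pages scraped from the niche.
docs = [
    "our widgets guide covers gears, gadgets, and everything in between",
    "blue widgets come in many sizes, and thingamabobs pair nicely with them",
    "widgets 101: care, cleaning, and the thingamabobs you'll want on hand",
    "a roundup of kitchen gadgets that have nothing to do with widgets",
]

def cooccurrence_rate(docs, anchor, candidate):
    # Of the documents mentioning the anchor term, what share also mention the candidate?
    anchor_docs = [d for d in docs if anchor in d.lower()]
    if not anchor_docs:
        return 0.0
    return sum(1 for d in anchor_docs if candidate in d.lower()) / len(anchor_docs)

for term in ["gadgets", "thingamabobs", "sprockets"]:
    print(term, round(cooccurrence_rate(docs, "widgets", term), 2))
```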

Term frequency - Inverse document frequency
This was the keyword stuffing killer. This is what stopped all of the idiots who were still writing for robots dead in their tracks. This is also what you want to watch out for when you hire SEO writers, because they think that means "mention the keyword as many times as possible."

[Image: 14tf.gif]

tf*IDF is basically a measure of how important a specific keyword or key phrase is to a piece of content. You could roll back through this guide and see which words are being used the most. I'd venture a guess... the, if, and, or, also, is, was, where, as...

That's why the word "Inverse" appears in tf*IDF. The amount of times a word is used is inversely proportional to its importance to the document. So basically, if you use your keyword too many times, or any word appearing within your key phrase, you're crossing a threshold where you begin to insist that it's less important.

The reason is that, in the way we actually write and speak, once we state a noun or the name of a product, we stop referring to it by that noun or name and start using words like "it, he, she, her, that." It's mathematically sound, which is why we use LSA terms to fill in the gaps and save our actual key terms for the important old factors.
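
Here's a minimal Python sketch of the math, just to show the "inverse" part in action. A word that shows up in every document gets an IDF of zero no matter how often you repeat it, while a term concentrated in one document scores highly:

```python
import math

docs = [
    "the best blue widgets are the widgets with steel gears",
    "this guide reviews the top gadgets for the home office",
    "our thingamabob comparison covers the cheapest models on the market",
]

def tf_idf(term, doc, docs):
    words = doc.split()
    tf = words.count(term) / len(words)               # how often the term appears in this document
    df = sum(1 for d in docs if term in d.split())    # how many documents contain the term at all
    idf = math.log(len(docs) / df) if df else 0.0     # common everywhere = zero weight
    return tf * idf

for term in ["the", "widgets"]:
    print(term, round(tf_idf(term, docs[0], docs), 3))
```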

Supplementary Content
Google has hired humans for millions of man-hours to rate supplementary content. They are feeding this data to their machines so that the machine learns to mimic the human's perception. So all that stuff you cram in your footer and sidebar needs to be useful to some degree.

Good ideas include stating your physical address, a phone number, links to your contact and about pages, displaying the most popular or related posts to the one being read, links to other related websites, an email list sign-up box, etc. Think about the things a human would deem useful and that's what the machine thinks too.

Bad ideas would include all advertising space, sitewide links to other websites, randomized content or links, any crawlable words unrelated to the topic at hand, etc.

[Image: 15main.png]

In this example above, everything outside of the red rectangle is supplemental content. How well do you think this type of website would rank these days?

Page Speed Time
While we do have an entire day dedicated to page speed, don't jump ahead. Just understand the concept. The slower your webpage loads, the less useful it is to humans. They will abandon it, press back before it finishes loading, and engage in all the other ridiculous instant-gratification behaviors. That's just how it is, and search engines know this. So they measure how long it takes your page to load. The faster, the better.

We've personally made a page load faster and seen a rise in the main keyword's ranking and an increase in long-tail traffic. The same goes for making your entire domain load faster. It's a very good thing and shouldn't be taken lightly.

[Image: 16speed.png]

You don't need to hit the 300ms level and likely won't if you have a magazine style website, and especially if you include any 3rd party ads such as Adsense. Realistically, if you can keep it around 1 second, you're doing fine. You'll learn a boatload of easy ways to improve your page speed and some harder ways that'll ensure you have an advantage over your competitors in a future day. Don't sweat it just yet.
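
If you want a rough number before that day arrives, you can time how long your server takes to hand over the HTML. This only measures the document itself, not images, CSS, or third-party ads, so treat it as a floor rather than the real load time. The URL is a placeholder:

```python
import time
import urllib.request

url = "https://example.com/"  # placeholder; swap in your own page

start = time.perf_counter()
with urllib.request.urlopen(url, timeout=10) as response:
    html = response.read()
elapsed_ms = (time.perf_counter() - start) * 1000

print(f"Fetched {len(html)} bytes in {elapsed_ms:.0f} ms")
```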

Length of Content
Sure, this is easy to fake. However, for it to be unique and not full of errors plus including the other new factors listed above and below is impossible to fake. There is a direct correlation (notice I said correlation, not causation) between the length of your content and how high it ranks. There was a time when 300 words cut it. 100 words can still cut it depending on the terms. But if you really want to take down an important term, 2,500 words is what you should be aiming for. My last post for a huge term was over 13,000 words. Guess how it's performing?

The reason this correlates highly with ranking is two-fold. Users tend to link to and share longer content loaded down with pictures and the enhanced content types we are about to discuss. The second reason is why they link to and share it: they believe it's more authoritative and of a higher quality. They trust it more. The search engines know this, so they go ahead and give you a boost for it. Then the users interact with it better and you get a second boost. Then they link to it and share it and you get a third boost. Get those boosts.

Enhanced Content Types
Here's the real winner, saved for last. If you include these enhanced types of content, you won't struggle to reach 2,500 or even 10,000 words.

Let's just talk about these conversationally. If I miss any, or your niche uses some that I don't mention, you'll be able to spot them immediately when you see them in the wild. Add them to your list!

[Image: 17food.jpg]

That looks delicious, huh? You know what wouldn't? If I just slapped a can of green beans down in front of you and handed you a can opener. That's what you're doing when people land on your site and are confronted by a giant wall of text. Offer them the buffet and they'll eat their fill and probably come back the next night.

The obvious "extras" you should be tossing in are lists, whether ordered or unordered. Videos are a great way to boost your time on-site per user. Images aren't even a question, but you can take it to the next level with an infographic within the article. The goal with all of this is to keep the user engaged and scrolling till they hit your call-to-action and do whatever you want them to, because they are having so much fun and learning so much.

Another one that ensnares so many people is quizzes. They just have to answer those questions and see where they measure against others or find out what type of Harry Potter character they are.

Is your article really long with lots of sub-sections like this one you're reading now? Consider adding a Wikipedia-style table of contents at the top, where all of the headers are listed as clickable links that jump the user directly to that section. Google understands these and even includes them in the SERPs. Don't forget your "Back to the Top" links!

Consider adding a Resources section that lists other related articles on your own website and on others together in one spot so your page acts as an information hub.

With the advent of hero images, this matters less and less, but watch what is above the fold on your site. The fold is obviously of different sizes these days, and with mobile and tablets it's just absurd. At least attempt to make sure content appears above any advertisements. This is a ranking factor.

And domain wide, does your site feature contact information and pages? Does it feature an About page that is substantial and includes data that Google might scrape and consider you an actual entity? Do you link out to your social network pages (with the appropriate schema mark-up, which we talk about in a future day in this guide)?

To reiterate, you should include enhanced content types that keeps your user engaged, entertained, and on your site for longer:
  • Lists
  • Videos
  • Images & Infographics
  • Quizzes
  • Table of Contents
  • Resource Sections
  • Above the Fold Considerations
  • Domain-Wide Entity and Trust Factors
Include all of these and you'll have no problem climbing to the top once you start your link building and promotion.


Conclusion for the Day...

I posted this here on BuSo the other day...

[Image: 18rank.png]

Yes, I hit as high as #14 for a 22,200 exact match search term with zero links and only the on-page SEO advice I'm giving you here, for free. Do you realize how much money and time you'd save on links and outreach if you'll apply the info here? I'm giving you the keys to the kingdom, but you still have to walk up to the door and unlock it. Sadly, many will gloss over this information. Don't let that be you.

Let's let all this sink in for a minute. Don't worry about memorizing all of this info. It's here for you any time. Remember, this is the rudimentary and advanced on-page stuff you should be doing without question. We also mentioned Schema Markup and Page Speed Optimization that you can explore later. These will help bring your entire domain's on-page factors together and boost each of your individual pages along the way.
 
Well, this article was awesome, I definitely need to come back and read it again in order to grasp these new principles more tightly. Thanks!

Other than that, I would like to hear your opinion on interlinking and siloing, in particular for MNS (10-20 articles big). A lot of people say that there is no need to make a silo structure on small sites that are based around one topic.

To my understanding, if I have a site for "German shepherds" structured in 5 categories with 4 articles each, talking about the history of the breed, different sub-breeds, specific details about the breed and so forth, I don't need to make a silo structure. Then, if I expand my dog site with another 60 articles discussing 3 more dog breeds, only then would it be suitable to start to silo the site around the main 4 topics. Is my understanding correct in general?

About the interlinking: do you have any principles, for example a minimum of 3 interlinks per 1k words, or do you only do it when you think it's natural and relevant? Something like in one article you link 1 time and in another you do it 7 times. If so, do you make sure to link to your categories or to the homepage in every single article?
 
@MeEatBrains

On Silo's
I agree. I wouldn't bother siloing a tiny site. It's essentially a single silo as it is.

If you started a large site called "EveryDog.net" (made that up, could be a real site, not checking), but started with German Shepherds only... I'd be thinking about future expansion in my build. I'd go ahead and create a main page for German Shepherds. I'd give a briefer summary about the history of the breed, the behavior characteristics, grooming needs, exercise needs, aggressiveness, whatever you'd write about. But after each of those H2's, I'd link to my own longer article about the topic. Then in each of those topics, I'd link back to the main topic.

Now you're creating a German Shepherd silo that cycles link juice to the main hub, back out through the spokes, and back through again. We know that PageRank flows with a dampening factor... probably around 70-75%, meaning if 100 "juices" went out of your silo head to a single sub-article, the sub-article would receive 70 juices. But it'd point those juices back to the silo head, sending back 49 more juices (70% of 70). So the two real questions become: "How many iterations of these juice cycles does Google calculate in a small loop or wheel, and do they have an internal fix to reduce the extra gain from the loop?" Mathematically, the dampening factor is the fix, but I could imagine them adding in another limiter. The point is that this silo system not only helps with content relevancy but boosts itself by really squeezing the juice out of each link.
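
If you want to see how quickly that loop settles down, here's a toy Python sketch of the cycle described above: one silo head and one sub-article linking straight back, using the 100 "juices" and the 0.7 dampening figure from this post. Real PageRank splits juice across every link on a page, so this only illustrates the geometric decay, not Google's actual calculation:

```python
damping = 0.7           # the dampening figure guessed at above
juice_at_head = 100.0   # juice arriving at the silo head from elsewhere
total_at_sub = 0.0
total_at_head = juice_at_head

for cycle in range(1, 11):                 # nobody knows how many iterations Google runs
    to_sub = juice_at_head * damping       # head -> sub-article (70 on the first pass)
    total_at_sub += to_sub
    back_to_head = to_sub * damping        # sub-article -> head (the "49 more juices")
    total_at_head += back_to_head
    juice_at_head = back_to_head           # this is what recirculates on the next pass

print(f"sub-article total ≈ {total_at_sub:.1f}, silo head total ≈ {total_at_head:.1f}")
```

Each full loop multiplies what's left by 0.49, so the totals converge after a handful of cycles instead of growing forever; in this toy model the dampening factor really is the built-in limiter.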

If this doesn't rank the main page and sub-pages, gain more links and create sub-sub-pages that create more linking opportunities. At this depth, you should be able to make some nice click bait or fun content types like the ones in the giant orange image on Day 7.

And yes, after this, I'd expand right into the next silo for Cocker Spaniels, Greyhounds, Huskies, etc. Same set up. I think I'd make the silos based around the breed, not around "history" and "behavior", for instance, based on what I think people are searching for. They search for "Breed history" or "history of a breed," versus "history of all breeds" and then narrow down. They jump right to it.

On Interlinking
You described exactly what I do. I link out when it makes sense, outbound and internally. A lot of times I never do until the site begins to grow horizontally in content. Then when I publish an article based on a new topic, I run a search to find old linking opportunities in past articles.

I never force an outbound link or an inbound link unless I'm purposefully writing supporting content. If I want to rank for "German Shepherd Mixes" and have a page about that, I might purposefully write content around topics like "The Goofiest Breed Mixes You've Ever Seen" just so I can link back to that article. That's like mini-silo structure in content relevancy, even if they are in totally separate sections of the site. I'm not huge on silo's personally. My preference is to link all over the place internally to make sure juice is flowing everywhere as much as possible. I want all of my ships to rise with the juice tide. But when I create those internal links, it's most important that they are contextually relevant. After enough leaps, it can stay relevant and transform into almost any other topic in the vertical.

No, I almost never link to my category heads or back to my homepage. Homepages these days generally only rank for their own Brand Terms, and my category pages just exist for humans for the most part. Of course, the homepage gets a ton of links and the juice flows to the category pages and then into the silos or other content. From there, I want it staying down in the content level, not above the silos. Homepages are going to rank for your Brand regardless, and not much else. So I don't focus on that. I do expand my category pages to include some content just so they aren't thin. They still feature posts rolling below that static content though. I don't try to rank them, that's just personal preference.

I hope I've dealt with your questions accurately. Thanks for asking.
 
At this point, the meta description is only a suggestion to Google about what they should display in the SERPs under your title.

@Ryuzaki Saw a tag that is new to me in my header that was inserted by YoastSEO:
<meta name="robots" content="noodp" />
After researching, I see this tag is supposed to ensure the search engines use your meta description.

Do you know if this tag is relevant or if it works?
 
Actually, Noodp refers to "No Open Directory Project" aka DMOZ. Many moons ago, Google would use information listed there as your meta description. Same goes with Yahoo's Directory, which led to a Noydir tag.

Both are severely outdated now, stopped being relevant around 2005 or so. Yahoo Directory doesn't even exist any more. DMOZ might as well not exist at this point.

I've not seen anything that can force Google to display the meta description you've explicitly requested. If you rank for a one-time search you didn't optimize for, they are probably going to show a random snippet with that keyword or a synonym and bold it.

My guess is they have data that shows this level of consistency (such as making sure the text in your PPC ad appears on your lander, etc.) short-circuits the thinking brain and produces more clicks than trying to persuade someone who has the opportunity to become skeptical and logical. It's a form of copywriting, taking advantage of heuristics like these.
 
First of all, wonderful post @Ryuzaki. I need to think out loud and pick your brain on a few things relating to long-tail keywords.

I have been immersed in social and haven't thought a whole lot about "SEO on purpose" for quite a while, and you mention that now there's more weight placed on the authority of the domain itself - which is a huge plus in my situation. Through social alone, I've ranked for some surprising terms - so I figure with minimal tweaking, I should be able to gobble up long tails without much more than a strong on-page strategy like you've shared here.

When I was ranking thinner sites for really targeted long tails, I always saw sites like Wikihow, Walmart, and Amazon as signs that this was a SERP I could take over, because those sites had countless pages of unoptimized content and weren't specifically targeting those keywords.

But now I'm thinking: with the strength of the domain playing more of a role, and since I'd imagine Amazon et al. know about on-page SEO and have done a better job of optimizing their pages, are these types of pages in the SERPs still a sign that it's easy territory to crawl into? It makes sense that sites like Amazon or Walmart are going to rank for long-tail product terms with buyer intent, versus a site that just talks about the product, right? If someone wants to "Buy blue widgets online", the more deserving #1 result would be a place where they can buy blue widgets online, instead of a site that has 1000 words about buying blue widgets online?

Basically wondering if a SERP that used to look weak because of Zon, Wally, Wiki, etc.. is actually going to be tougher to break into these days, since not being over-optimized and having a powerful domain is what used to get them there in the past, and now those are even more favourable traits for a page to have?
 
@Potatoe ,
I used to be a big fan of Market Samurai because it made the task you're asking about much easier. They have a module called SEO Competition. You could pin your own page to the top, and then have it pull the data for your page plus the top 10 ranking URLs. It looks like this:

[Image: 19market.png]


I didn't reinstall this software on my new computer so I had to yank that image from their site.

The point is that you can create a matrix of data like above and get a good idea of whether the big eCommerce sites are ranking because of:
  • Generalized domain authority (with no or few links to an unoptimized page)
  • or because the pages are optimized well in addition to being on the bomb-ass domain
My answer is that both of these scenarios still occur, but you've rightly pointed out an important factor. Google understands the intent of search phrases and ranks types of pages from types of sites for those accordingly.

You can force your way into an eCommerce SERP with a content-based review page, and that content with all of the optimization and enhanced content types above will definitely help you. But at the same time, I'd make sure I gave Google what it was expecting on that page first and foremost, as high up the page as I could. This includes things like prices, technical specifications, numerical reviews in categories, review statements by people, etc. Most of the eCommerce pages are going to have next to no actual content. You can outweigh them here for sure.

Before I actually consider going after something like this, I'm going to look at two things: their on-page optimization first. Market Samurai is slipping in usefulness for this as synonyms and variants have begun gaining more weight. I'm going to make a quick pass at the top 5 and see if their on-page sucks or not. If they suck, I may look at the top 10 too just to get a better gauge, but top 5 is where the traffic explosion happens, with the top 3 and then #1 being exponentially even higher. If I can't get in there... I may publish a page that I didn't work as hard on just for completion's sake if I'm at that stage of scaling.

If the SERP passes the poor on-page test, then I'm going to look at those pages' specific backlinks, not sitewide but for that page itself. If you're looking at 10 or fewer links to most pages, see what kind they are. They may not even be contextual. If you see several in there with zero links, in addition to poor on-page across the top 10, I'd go for it.

To summarize:
  • High DA + Poor On-Page + Zero to Ten Links = Go Time
  • High DA + Poor On-Page + Lots of Links = Probably will Move On
  • High DA + Good On-Page + Lots of Links = Escape Now
As your site also becomes one of those High DA monstrosities this scale might shift a little.

For instance, I once went for a scenario like this where I was up against all .Govs and .Edus trying to rank for the name of a career position. It was a Go Time scenario, but I didn't expect to pull it off with only a single page about the topic, which is how they did it. I made an entire site around the topic with the homepage being an EMD (exact match domain). I ranked #1 for it, but it took a lot of effort, obviously.

Yes, I think it's possible without building entire sites, using just one very well done page, especially when we are talking about eCommerce versus government and educational competition. But if my domain wasn't already pumping, I'd probably publish the content, set up a rank tracker, and forget about those pages and wait for a notification from SerpWoo, or I'd just wait and focus on other topics and keep building links and social signals until I felt I could click publish and 3 days later pop right into the top 20.
 
Actually, Noodp refers to "No Open Directory Project" aka DMOZ. Many moons ago, Google would use information listed there as your meta description. Same goes with Yahoo's Directory, which led to a Noydir tag.


This is funny. Last year I had a random guy call me. He couldn't figure out why Google was showing a description different from what he had in his homepage HTML. This was the first time I had ever seen something like this, so after some research, I realized Google was still pulling from his DMOZ listing. The guy's site was from 1999 and hadn't been changed in years, so it made sense. I added the Noodp tag, and the issue was fixed within a couple of weeks. Shit, that reminds me, that guy still owes me $100. lol.

Great post by the way. Always good to have a refresher.
 
@Ryuzaki What are your thoughts on more than one H1 tag? I've done some research and some people say it's okay and others say it's blasphemy. I'm working on some pretty long articles, and sometimes it makes sense in my mind for a 2,000 word article to have two or three H1s.

Maybe I just need to get more comfortable with H4 tags lol.
 
I wouldn't do it simply because it breaks from the philosophy of headers which are intended to be nested.

Just like in collegiate essays, scientific reports, and novels, you only use an H2 if you've already used an H1 first and it only comes after. H3's are only to be used after an H2. You know this but I type it for the benefit of anyone who is unsure.

An example format would be:
  • H1 - title
    • H2
      • H3
      • H3
    • H2
      • H3
        • H4
        • H4
        • H4
      • H3
        • H4
        • H4
    • H2 - conclusion
If you feel you need more than one H1, then you really need a separate article, or you should be using H2s, of which you can go ahead and use as many as you need.

There is an instance where using a horizontal rule (<hr> tag) could semantically suggest a clean enough separation to warrant another H1, but even when I do that I still continue on with an H2.

I think of an H1 as being meant to encapsulate the entire article.

In terms of SEO impacts, I couldn't tell you. I suspect Google encounters weird heading issues all of the time. Not a day goes by where I don't see some random theme developer screwing it up and passing it out to thousands of people. Newbies who tweak their sites will often use the wrong header because they like the way it looks according to the CSS.

My suspicion is that it won't have much of a negative impact if any, but it definitely won't provide a positive impact.
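
If you'd rather check than suspect, a heading audit takes a few lines of Python with the standard library. This only looks at the tag sequence, which is enough to catch a stray second H1 or a theme skipping from H2 straight to H4. The sample markup just reuses the widget headers from earlier:

```python
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.levels = []

    def handle_starttag(self, tag, attrs):
        # Record every h1-h6 in the order it appears.
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self.levels.append(int(tag[1]))

html = """
<h1>The Best Widgets are Blue - The Top 5 Reviewed!</h1>
<h2>What to Look for in Blue Widgets Reviews</h2>
<h4>Acme Model 3</h4>
<h1>Another Stray Title</h1>
"""

audit = HeadingAudit()
audit.feed(html)
print("H1 count:", audit.levels.count(1))
for previous, current in zip(audit.levels, audit.levels[1:]):
    if current > previous + 1:
        print(f"Skipped a level: h{previous} jumps straight to h{current}")
```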
 
@Ryuzaki scenario: You're writing an article titled Top 50 Vacation Locations for Lonely SEOs. That's a lot of locations. You're going to write 50-100 words about each location with a picture, link, and a header for each location. Would you use an H2 for every single location on the list? One H1 and fifty H2s? I'm asking because I've seen quite a few big sites use H3s for this type of list, even if it means skipping H2 altogether. How would you structure this article, and would you make any use of <ol>?
 
I wouldn't straight list 50 locations. That'd be boring. I'd find a way to subdivide them, and those subdividers would form the H2s. The reason is that I can create better optimization by emphasizing fewer H2s and then letting there be a ton of H3s, which carry less of an on-page impact. So it might look something like this, including how I'd use ordered lists:

_____

H1 - Top 50 Vacation Locations for Lonely SEOs

There's a whole ton of spots around the globe for neckbeards to relax... especially where your native language isn't spoken that well. Now you can meet people without the anxiety of feeling like you'll say something wrong. Let's make our way around the globe...

H2 - The Best Vacation Spots for IMer's in The Levant

The Middle East has a lot to offer an internet marketer, especially if you bundle India into the mix as some do, which is silly. Choose the right location and you can get as much done as normal for pennies on the dollar while maxing and relaxing.

H3 - Saudi Arabia
[Picture]
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Aenean varius velit id eros hendrerit vestibulum ut eget libero.

H3 - India

[Picture]
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Aenean varius velit id eros hendrerit vestibulum ut eget libero.

H3 - Turkey

[Picture]
Blah blah, here's a list of the top 5 cities in Turkey for the digital nomad:
  1. Istanbul
  2. was Constantinople
  3. now it's Istanbul
  4. not Constantinople
  5. it's nobody's business but the Turks

______

Then I'd hit up the next H2 and go through the Occident, Orthodox, Orient, etc. There's all kinds of data you could cram in those sections that you could use lists for, tables, etc.

I don't ever recommend skipping a header depth. They are meant to be nested. If you see someone doing that, it's often because they like the CSS of the H3 better than H2 and don't know how to change it and don't realize that it can matter. Can you get away with that stuff? Sure, at your own peril.

With 50 H2's, you might as well have zero H2's, which means you've lost a major chance at some free organic traffic for the lifetime of the article.
 
If I wanted to do, for example, an article with the best computer mice, which of these two options would be the best?

1: A great article that collects all the information on all types of mice.
2: An article with the best cheap mice, another with the best mice for left-handed, another with the best wireless mice etc ...
 
@Vert, I'd say both at the same time are the path.

I create giant review articles with information sections, sub-sections like you mention in your 2nd point, and more. These are for the largest keywords with the most competition. It helps you bring in all the various LSI synonyms and phrases and shows Google that your article is the most all-encompassing and for that alone deserves to rank.

These big articles can often take down the smaller keywords you've mentioned, but you'll find yourself losing them over time as thirstier SEOs come in and go for the low-hanging fruit. In that case, it would take you very little time, relatively, to go ahead and produce separate articles for these too. You'll lock your ranking in due to the hyper-optimization around the key phrase itself. Then you can interlink back to the main money maker and help support it with juice and relevancy.

You don't necessarily need to tackle all of these smaller keywords at first, but it does help. You might argue that you should wait until someone encroaches on your territory, but I'd also warn to not underestimate the long-tail traffic you'll get from these additional articles.

Now you're talking about variations like "left hand" and "wireless" and "trackball"... Those I would definitely create content for. I don't necessarily recommend chasing each and every variation like "Best Mouse Under $xxx". They're nearly infinite and you can tend to rank for them if you create "Budget" sections in the big article with the proper headers. Yes, they have great buying intent but you do end up rewriting the same content over and over with contradictions between each other.

I wouldn't bog down there when you can expand horizontally to the next product. But I would tackle variations on features that have respectable volume, as well as mention them in a much shorter fashion in your main review for the big terms.
 
Can short and temporary content (such as a news story) negatively affect the SEO of a site?

Logic makes me think not, but content with such a short "useful life" makes me think it could negatively affect a website globally.
 

It depends. Panda, which started life as the Farmer update, was designed to combat these approaches. In years past, Matt Cutts, and more recently, Gary Illyes, have both commented on this topic here and there. When all of the puzzle pieces come together, it looks like this:

Are you adding value?
They've specifically mentioned concepts such as combining RSS feeds. If you pull paragraphs from 5 different feeds and combine them onto one page, even though you've "remixed" the content, they still know it's not unique. If you add widgets, you're not adding value, you're adding widgets.

But they've also said things such as... adding a paragraph of commentary at the start can add benefit for the user. Using extended quotes is not seen negatively. Even images can add "value."

To circle back to your question, because you weren't explicitly talking about copy and paste content, the answer is yes and no. If the content is temporary, you may delete it before it has an opportunity to harm you. If Google finds you're creating work for them in terms of indexing and then un-indexing, then it's probably harmful, especially if the content can be shown to be short. I would never publish anything less than 300 words unless it was a contact page. I personally try not to go below 400-500 unless I've added a substantial number of images as well.

If you plan on removing the content, then I would suggest assigning a no-index meta tag to each piece or placing them in a sub-folder with a no-index imperative in the robots.txt file.

I would suggest considering what your aim is. If it's strictly SEO, then I wouldn't take any "temporary" route. If it's strictly viral traffic or content intended to be linked to, then I'd no-index my whole site and copy and paste till the cows come home. If it's both, then I'd consider either no-indexing a sub-directory or making sure each piece adds substantial value.

In order to add value, if large portions are quotes, then I'd add a paragraph of commentary above and below, at the top and bottom of each post. I'd add images, perhaps embed a YouTube video, and even add some of the witty YouTube comments with more commentary on them (or re-write them and pretend you came up with it).

If your goal is to gain traction and create an SEO snowball, then temporary is a bad word, and short can be okay but the minimal effort it takes to make it less short is worthwhile. It's easy to remix and rewrite from a lot of different sources before publishing and only add a few minutes to the workflow on each article.
 
When all of the puzzle pieces come together it looks like this:

Are you adding value?
This is what worries me when posting news. It's content that can bring value to the user when it's fresh, but as soon as the days pass, it's no longer news and therefore, I think, no longer useful.

On my main website I have about 200 posts with this type of content (technology news) and all are indexed in Google. Google is my main source of traffic.

I think deindexing all this content at once could be more harmful than beneficial. Am I wrong?

What I'm thinking is to stop publishing such content and offer only more elaborate, timeless content, because I don't see an easy and simple solution for this.

Thanks for your help.
 
On Topical Optimization

I've been talking about this quite a bit on the forum recently and feel like I've been repeating myself, which means it's time to add this to the Crash Course so I can just point people to the post.

Google has moved beyond keyword optimization and into the realm of topical optimization. But their problem is they still have to return results based on keywords. So where does that leave us and how do we take advantage of this?

I recommend that you stop thinking of optimizing for a single keyword (one word or several as a phrase). If you're doing low volume keyword sniping, going for one keyword is fine. But if you're working on monster posts of 5,000 words and tons of enhanced content types, you might as well maximize your returns.

Instead of optimizing for one keyword, find the parent keyword related to your specific keyword and make that your "main keyword." This is usually the most generalized in intent, usually an informational keyword, but it can be a buying keyword as well. Then identify all of its child keywords.

The question then becomes, "How do I know what the child keywords are?" The three tools I like for this, in order of effectiveness, are Google's Keyword Planner, SERPwoo, and Ahrefs. Keyword Planner will literally show you the specific groups, while SERPwoo shows you related terms you should include that you will rank for automatically once you take down the others, and Ahrefs does something similar.

For example:
  • dog beds
  • puppy beds
  • doggie beds
  • beds for dogs
  • dog bedding
  • dog cushions
  • dog mattresses
And all kinds of variants on those using words like 'canine' for example... where the top item is the parent keyword. This is a quick example just to illustrate and not perfect by any means.

What's happening here is that instead of focusing only on the parent keyword, you'll reduce your usages of the parent to perhaps just the Title Tag and the URL Slug, and maybe an H2 and one instance of using it in the body and an alt tag.

Then, all of the other places, like other alt tags, H2's, and H3's, you'd use these child keywords. So your density for the parent topic is nearly nonexistent but your density for the topic is where you'd generally expect to land back in the day.

Now you're optimized for the topic at large, using the parent in the most important spots and the children in the other effective on-page areas. Then you go back and make sure you have all of your LSI terms about dogs and beds in general in there. If you want to, you can do a TF*IDF analysis on your percentages for the parent and each child keyword and adjust to match the top 10 results.
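
As a sketch of what that comparison might look like in Python, here's a rough per-1,000-words count for each parent/child phrase on your page versus an average across competitor pages. The text snippets are placeholders; in practice you'd pull the visible text of the top 10 results:

```python
def per_thousand_words(text, phrase):
    words = text.lower().split()
    return text.lower().count(phrase) / max(1, len(words)) * 1000

my_page = "our favorite dog beds for big and small breeds, plus puppy beds and dog bedding picks"
competitors = [
    "the best dog beds for large breeds, plus dog bedding and dog cushions we like",
    "a puppy beds buying guide, with beds for dogs that chew through everything",
]

for phrase in ["dog beds", "puppy beds", "beds for dogs", "dog bedding"]:
    mine = per_thousand_words(my_page, phrase)
    theirs = sum(per_thousand_words(c, phrase) for c in competitors) / len(competitors)
    print(f"{phrase:15s} you: {mine:6.2f}   top-10 average: {theirs:6.2f}")
```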

The result here is that you're optimized for the parent keyword but you've also illustrated to Google that you've explored the topic to a depth that your competitors likely haven't, and you're also semi-optimized for the children. As this page gets some age and authority, one of two things happens:
  1. You start to rank for the parent and ultimately take down all of the children
  2. You start to rank for the children and then that qualifies you to rank for the parent
Now, at the end of the day you may never get any of the terms you optimized for. The topic may be too difficult, but you will, without a doubt, start ranking for countless long tails you never considered. While they may not be worth targeting individually and you'd never attempt to take them down on their own, you'll get a ton of them by optimizing for the entire parent/child basket. And because the pages that may be ranking for the parent and children are doing it by the sheer power of the domain, they're leaving those long-tails open for you to consume.

Here's a few examples at random I grabbed from a site of mine:

[Image: 20topical.png]

That last result is particularly interesting. I grabbed that one last because I know it ranks #1 for the parent keyword. These are Ahrefs results. I just checked my Analytics and that page alone brought in over 5,000 in organic traffic, with zero backlinks.

That's because on-page is the foundation on which everything else depends. If I were to get that page even 2 or 3 solid contextual links and waited 3 months, the number of organic keywords would double at least.

The game is fast changing with RankBrain and Machine Learning in place. No, you don't need to go back and alter all of your old content. But going forward on big posts, I recommend doing this and possibly re-optimizing old informational content. If you have money posts that are currently ranking and banking, I don't suggest disturbing them, because you will see a negative bounce in rankings at first that can last weeks. If you have money posts that aren't yet performing and you're willing to extend that period of time in order to get better results, then you may consider re-optimizing them.
 
On Proximity, Prominence, Prevalence, & Density

Here's something old that's in play and something new that I think is important and I failed to mention before. This is all about keyword placement, whether that's in specific on-page locations or within the context of the article at large. Let's start with the old stuff.

Proximity
/präkˈsimədē/ - nearness in space, time, or relationship

You've probably heard of this. We want to exploit the concept of proximity, which suggests that there are certain positions, even within our on-page optimization spots, that Google may place more emphasis on.

The only two places where I really worry about this are the title tag / H1 header and the H2 headers. Let me give you a couple of examples so it'll make sense. Our keyword here is "aggressive dog breeds":
  • 8 Aggressive Dog Breeds Apartment Complexes Have Banned -
  • Pitbulls and Chihuahuas are Among the Two Most Aggressive Dog Breeds - x
I'm suggesting the first example is a far better way of writing your title tags and headers. Why? Because the keyphrase we're optimizing for is closer to the front of the title tag or header. Its proximity to the front of this "optimization spot" suggests that it is the important part of the topic. The second one suggests that pitbulls and taco bell dogs are the important point of the article.

The main point I'm trying to make is that you should shove your keyword as close to the front of the title tag and header as possible. It should be the main noun and subject of the sentence, not a descriptor of some other main subject.
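If you're batching out titles and want a quick sanity check, here's a tiny helper I'm sketching purely for illustration (it's nothing standard, just string searching) that reports how far from the front the keyphrase sits:

```python
def keyword_offset(title, keyphrase):
    """Return the character offset of the keyphrase in the title, or None if it's missing."""
    idx = title.lower().find(keyphrase.lower())
    return None if idx == -1 else idx

titles = [
    "8 Aggressive Dog Breeds Apartment Complexes Have Banned",
    "Pitbulls and Chihuahuas are Among the Two Most Aggressive Dog Breeds",
]
for title in titles:
    print(keyword_offset(title, "aggressive dog breeds"), "-", title)
# The first title scores a low offset (keyphrase near the front); the second scores a high one.
```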

Prominence
/ˈprämənəns/ - the condition of standing out from something by being noticeable

This is very similar to proximity, except instead of dealing with position inside a single "optimization spot", we're dealing with position within the article at large. What does this mean in normal people speak?

We can jump right to the main point here:
  • Good - Make sure your keyphrase is in the first and last paragraph of your article.
  • Better - Make sure it's both in the first 100 characters and the last 100 characters of the article.
  • Best - Make it the first words and the final words.

Realistically, you can't always ham-fist it in as the first and last words. The "Better" option works just fine, since this isn't the most important on-page factor in existence. But the more we can stack the deck in our favor, the better.
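Along the same lines, here's a minimal sketch (again, just an illustrative helper, not a tool anyone actually ships) that checks whether a draft hits the "Better" target of having the keyphrase in the first and last 100 characters:

```python
def prominence_check(article, keyphrase, window=100):
    """Check whether the keyphrase appears in the opening and closing windows of the text."""
    text = article.lower()
    phrase = keyphrase.lower()
    return {
        "in_first_window": phrase in text[:window],
        "in_last_window": phrase in text[-window:],
    }

article = "Aggressive dog breeds get a bad rap. ... Know the law before adopting aggressive dog breeds."
print(prominence_check(article, "aggressive dog breeds"))
# {'in_first_window': True, 'in_last_window': True}
```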

Prevalence
/ˈprev(ə)ləns/ - the condition of being common

In our mission to make sure some machine-learning, natural-language-processing robot understands what our main keyword is, we need to make sure that keyword is prevalent and common. You have to play it safe here, because as you use all of these tips in this post, you can easily tip over into over-optimization in the keyword density arena.

To explain what I mean by 'prevalence', let me explain what not to do. Let's assume you've decided you get 8 total uses of your keyphrase (not counting image alt tags and the meta description) in your 1,500 word article. You're already burning one in the H1 header, one in the H2 header at the top, one in the H2 header at the bottom, and two in the first and last paragraphs. That gets you to 5 already, so you have 3 left.

These are numbers I'm making up, by the way, just to illustrate this point. So what should you do with your 3 remaining uses of your keyphrase?

Say you cram them all into the first 300 words of the article. That means you have about 1,000 words of content in the middle of the article that never once mention your keyword. Sure, they mention related terms and entities, etc., but they don't feature your keyword. Would you assume that keyword is the most important topic of the article? Yes, you would, due to the other optimizations you've done, but you wouldn't consider the signal to be as strong as you would have if you'd run into it throughout the article.

So basically, what I'm suggesting is that after you optimize your article with the "must have" keyword usage spots, you should take your remaining uses and spread them out evenly throughout the article.

In this way, Google won't assume only certain parts of the article are relevant to the keyword and others aren't. It'll understand that this topic is a central theme to the entire article.
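If you want to eyeball how evenly those remaining uses are spread, here's a rough sketch (my own quick check, nothing official) that prints where each occurrence falls as a percentage of the article's length:

```python
import re

def keyphrase_positions(article, keyphrase):
    """Return each occurrence's position as a percentage of the article length."""
    text = article.lower()
    length = max(len(text), 1)
    return [round(100 * m.start() / length, 1)
            for m in re.finditer(re.escape(keyphrase.lower()), text)]

article = ("Aggressive dog breeds are banned in many complexes. "
           "Insurance policies often list aggressive dog breeds by name. "
           "Check local bylaws before adopting aggressive dog breeds.")
print(keyphrase_positions(article, "aggressive dog breeds"))
# Roughly evenly spaced percentages suggest the keyphrase runs through the whole article
# rather than being crammed into the first few hundred words.
```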

Density
/ˈdensədē/ - the quantity of things in a given space

I haven't spoken enough about density. We all know not to do keyword stuffing now. Back in the day, like the late 90's, you could make sure your keyword density was 10% or even 15% and shoot to the top.

Nowadays, I shoot very low. Usually I'm hovering above 1% and never over 3%. An obvious tip to stay this low is to refer to your keyword, which might be "4-slice toaster", using words like "this, these, it, those, our favorite appliance, this device, etc."

By the time you tackle all of the main optimization spots, you're in the sweet spot for a 1,000 word article. If you're publishing shorter articles you'll be pushing it, and you'll have a harder time dialing in the more advanced methods in this thread anyways.

So yeah, dodge keyword stuffing. You can get away with higher amounts these days, since everyone else isn't keyword stuffing any more, but it's not necessarily going to help you. And don't freak out if you're at 3%; it's okay.
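For reference, keyword density here just means occurrences of the exact phrase divided by total word count. A quick sketch with made-up sample text (the naive word-splitting means the percentage is approximate):

```python
import re

def keyword_density(article, keyphrase):
    """Approximate keyword density: exact-phrase occurrences / total words, as a percentage."""
    words = re.findall(r"\w+", article.lower())
    hits = len(re.findall(re.escape(keyphrase.lower()), article.lower()))
    return round(100 * hits / max(len(words), 1), 2)

article = "A 4-slice toaster saves time. This device fits four slices. Our favorite appliance is the 4-slice toaster."
print(keyword_density(article, "4-slice toaster"))  # roughly 10% here only because the sample text is tiny
```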

Here's some other things not to worry about. Let's say you have a 1,500 word article optimized around the keyword "how to grow tomatoes". Let's say you use that exact phrase 7 times. Perfection. Then you search for the term "tomatoes" and you find you've used it 34 times! Don't catch the vapors, it's fine.

You weren't going to rank for the word "tomatoes" for this article anyways. I'm not suggesting you purposefully do this, but you can hyper-optimize for a general entity like "tomatoes" while actually optimizing for the longer phrase "how to grow tomatoes" and really shove it down Google's throat what the article is about. Hidden in this paragraph is a huge link building secret I discovered about 7 years ago to rank very well for high volume and high competition terms that still works today. It's pretty obvious and not complex, but hard to stumble upon. That's your reward for reading all this crap I've written to the very end.

So the summary here is don't stuff your entire keyword phrase, but if it's based around a main noun like "how to grow tomatoes", don't freak out about typing "tomatoes" 30 times in 1,500 words, as an example. It's fine.
 
On NERD - Named Entity Recognition & Disambiguation

I wrote about this in the Exp 5 Day of the Crash Course, but I figured it should be here too. This is cutting edge to most SEO's and I'm going to guess that most readers of this post haven't heard of it.

It has to do with Natural Language Processing (NLP) and how Google is attempting to understand exactly what the topic and intent of an article is, while getting a grip on the depth and breadth (aka quality) of the article.

Every word in this NERD phrase matters: Named Entity Recognition & Disambiguation. Let's break it down:
  • Named - there's a proper name and noun for this topic
  • Entity - there's information about this topic that makes it unique and gives it abstract form
  • Recognition - by understanding this information, the entity can be recognized accurately
  • Disambiguation - it can be recognized as separate and unique from similarly named concepts
First and foremost, where in the holy hell would Google get this information? This would be a monumental task. The first place to start would be a typical encyclopedia but the problem is that there's no web of connections between the named entities in these encyclopedias.

So who has encyclopedic knowledge but also has created a web of interconnections so we can understand which named entities are related to each other? Wikipedia. Who has actually collated this into usable data you can download and use? Wikimedia.

This is a monumental task, one only fit to be created through crowdsourced efforts. That's what Wikipedia is, so I'm suggesting that Google probably uses Wikimedia's database. And so can you, very easily.

-----

Now that we've established what this is and why it's being used, let me give real examples of what problem exists and how NERD solves it.

Take these two example sentences:
  • Paris is the capital of France.
  • Jaguars are the fastest feline.
A "dumb" robot would read the first sentence and maybe understand that Paris is the subject of the sentence. But it would scratch it's head and say "Are we talking about Paris the city or Paris Hilton the celebrity?" The same goes for the 2nd sentence. "Are we talking about the cat or the car?"

The problem is that there's more than one named entity for Paris, just as there is for Jaguar. So the part that has to occur is the Disambiguation. The way NLP is able to tell the difference is through the other information in the sentence (or article), such as "capital" and "France". Okay, we can search the database and see that there's a specific entity called Paris that is a capital city and is located in France.

Check out this confusion though. There's TWO entities that are Jaguars that are the "fastest". Both a cat and a car. We've gotten nowhere, but thankfully there's one more word in that sentence: "feline." Boom, we can now disambiguate between the two entities to realize we are, in fact, talking about the cat.
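To make this concrete, here's a minimal sketch using spaCy's small English model, purely to show that an off-the-shelf NLP library already does the "recognition" step on those two sentences (Google's systems are obviously far beyond this, and full disambiguation to a specific Wikipedia entity needs a separate entity-linking step):

```python
# pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")

for sentence in ["Paris is the capital of France.", "Jaguars are the fastest feline."]:
    doc = nlp(sentence)
    # Each recognized span comes with a coarse label (e.g. GPE = geopolitical entity).
    print(sentence, "->", [(ent.text, ent.label_) for ent in doc.ents])
```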

-----

Of course, all of this should be happening naturally as you talk about your topic. The important thing to do is to not get "cheeky" as our friends across the pond would say. Don't use analogies and jokes in your content. You'll confuse the robots. I hate when I hire a writer and they start referencing movies and TV shows and crap to be relatable. Don't do it. It should be obvious why at this point.

The "named entity" is your main keyword, basically. "Disambiguation" is going to happen naturally for the most part. The opportunity before us is to drive home the "recognition" part, which not only further disambiguates but also shows Google that you have depth and breadth (quality) going on.

Again, this data is likely coming directly from the Wikimedia database which is coming directly from Wikipedia's articles. And they make the job very easy for us:

7nerd.png

This is the opening paragraph about NERD. You see all those bolded phrases? Those are related entities. Talk about those, and Google will get a better idea of what your topic is about and see that you're going into it with depth.

Now scroll down to the bottom:

8also.png

More related entities tied to the main named entity that is the topic of the article. Mentioning these phrases (which are entities, as in they basically have their own Wikipedia pages) means your content is connecting all the dots in the web of "entity recognition". You're re-establishing these relationships in your article and removing all confusion about what you're actually talking about (in the mind of the robot).
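And since Wikimedia exposes all of this, you can pull those related entities programmatically. Here's a rough sketch using the third-party `wikipedia` package (one of several wrappers around the MediaWiki API; the page title below is just an example, so swap in your own topic):

```python
# pip install wikipedia
import wikipedia

# Fetch the article for your main named entity and list the other entities it links to.
page = wikipedia.page("Named-entity recognition", auto_suggest=False)
related = page.links  # titles of linked Wikipedia articles, i.e. related entities

print(f"{len(related)} related entities, for example:")
for title in related[:15]:
    print(" -", title)
```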

This isn't the final piece of the puzzle though. Entities are only recognized as such for two reasons: they are "named" and there's "recognition". They're recognized not only by this web of interconnectivity with other entities, but because there's information attached to them (the info that's in the Wikipedia articles, as an example).

This information is factual, and confidence that it's correct is astronomically high at this point. So including it shows you have depth, breadth, and accuracy, which means you can be trusted to appear at the top of the SERPs.

Did you know Paris is the capital of France? It's also the most populated city in the country of France. In fact, it has 2,148,271 people living within its borders as of 2020. It takes up a surface area on the earth of 105 square kilometers, which is 41 square miles. It had a GDP of $808 billion in 2017. The Paris Metro opened in 1900. This subway system serves 5.23 million passengers daily.

That right there is straight up NERD / NLP food. It's factual information that can disambiguate the named entity and reinforces recognition and is written in simple short sentences that an NLP bot can understand.

This is the atomic bomb of on-page SEO. I shouldn't even have shared it. Do everything in this thread exactly as I've typed it, and you've maxed out on-page SEO. Very few people will be going to these lengths. All that's left is links and traffic to dominate the SERPs.

BRB, I have to go to my weekly session at Outers Anonymous.
 
@Ryuzaki This post is awesome, thanks so much! Quick question for you, I noticed in the beginning that you mentioned, "Generally we will make our H1 tag the same as our title tag."

For the longest time, I've been using either Yoast or SEOPress.org, making sure that each main KW is at the beginning (sometimes too frequently, I'm guessing) and keeping the H1 pretty KW-focused and identical to the URL.

For example, using your blue widgets example, I might use the following structure if I'm writing a review on them:

Title: Blue Widgets Review: 5 Best Worth Buying In [Year]

H1: Blue Widgets Review

URL: /blue-widgets-review

I guess my question is, should I not waste my time making H1 and Title different from one another? Have you tested having H1 and Title the same vs different and if you had to choose which route to take, what would you suggest?

Thanks again for the write-up! Loving this Crash Course content.
 
I guess my question is, should I not waste my time making H1 and Title different from one another? Have you tested having H1 and Title the same vs different and if you had to choose which route to take, what would you suggest?
I would suggest there's nothing to gain, SEO-wise, by taking the time to make the H1 different from the Meta Title if you've written a good Meta Title. I do think there's something to gain in terms of user expectation and creating a consistent, continuity-fulfilling funnel, though.

For instance, if the user clicks on your link in the SERPs that says "Blue Widgets Review: 5 Best Worth Buying In [Year]" and then lands on the page and the title of the page (the H1 in our terminology) says the exact same thing, there's continuity and little psychological friction. If they land on the page and the page title now says just "Blue Widgets Review", that can be a minor cause for concern: it can make them wonder if they're where they thought they wanted to be, hit the back button, etc.

But as far as SEO goes, having your key phrase at the start of a longer H1 header versus having only the key phrase in an exact match manner is likely insignificant. It might even give you less wiggle room in terms of optimization in other places. I can't say for sure, but I'd suspect there are a lot of little red flags that can be raised by things "greedy" SEO's do that normies don't. This could be one of them. Other examples would be exact match domains, over-optimized exact match anchor text profiles, keyword stuffing, etc.
 
@Ryuzaki this makes complete sense to me now. I never looked at it like that, nor did I take the continuity into consideration. Also, it actually makes life easier only having to think of 1 really good H1/title versus both. Thanks, I'll give that a whirl!
 
No, I almost never link to my category heads or back to my homepage. Homepages these days generally only rank for their own Brand Terms, and my category pages just exist for humans for the most part. Of course, the homepage gets a ton of links and the juice flows to the category pages and then into the silos or other content. From there, I want it staying down in the content level, not above the silos.
Regarding the inner linking: I'm noticing that I've been what one may call "extremely aggressive" with linking back to the homepage. What do I mean by that? Well, let's just go ahead and say that the links back to the homepage stick out like a sore thumb to me now. I'm wondering if it's better to slowly dial this back or leave them as is? I have even done this on the money pages, and now that I think about it, it really makes no sense to internally link back to the homepage, especially on a magazine-style site rather than a single page of unique long-form content. Curious what I should do here? Any suggestions?

I'm also noticing, based on my efforts here with help from you @Ryuzaki, that I'm now looking at everything and noticing that I've got TONS of dofollow inner links with "Read Review" from the tables that I've created. They are on TONS of pages. I'm wondering if this is a really bad thing too? Perhaps it's looked at as being over optimized even though it's not a KW I'm trying to rank for? Not sure here really.
 
I'm wondering if it's better to slowly dial this back or leave them as is?
I don't think linking back to the homepage a ton of times is harmful, but it's perhaps an inefficient use of page rank. Remember that every link has a dampening factor that doesn't flow all of the page rank through to the other side. Let's say it's 85%.

Of the potential 85% of the page rank you could have sent straight to another article, you sent it to the homepage instead, and now it has to make another 85% hop back down to an inner page. That drops you to about 72.3% (0.85 × 0.85) reaching the inner page instead of 85%. Let's say it has to jump through a category page first and then an inner page. That's about 61.4% instead of 85%.
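A tiny sketch of that arithmetic, with the 85% pass-through being the guesstimate from above rather than any published figure:

```python
# Rough page rank "hop" arithmetic using an assumed 85% pass-through per link.
DAMPING = 0.85

def passed_through(hops, damping=DAMPING):
    """Fraction of page rank that survives a chain of `hops` links."""
    return damping ** hops

print(f"straight to the inner page:  {passed_through(1):.2%}")  # 85.00%
print(f"via the homepage first:      {passed_through(2):.2%}")  # ~72.3%, as above
print(f"via homepage, then category: {passed_through(3):.2%}")  # ~61.4%, as above
```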

Since you're bleeding 15% (again, a guesstimated number) out into the aether with each hop, you might as well send it straight to an inner page, since inner pages are the ones that rank for keywords these days. Your homepage is going to rank for its brand terms even with zero page rank flowing through it.

The way I see it, for content sites and eCommerce sites (but not necessarily local business sites), the homepage should collect links and juice and spread it down throughout the site. But it shouldn't be cycled back up to the homepage (except through supplemental links like a logo link and maybe a footer link, which is just good user experience).

and noticing that I've got TONS of dofollow inner links with "Read Review" from the tables that I've created. They are on TONS of pages. I'm wondering if this is a really bad thing too? Perhaps it's looked at as being over optimized even though it's not a KW I'm trying to rank for? Not sure here really.
I'd venture to say that you're over-thinking this part. The page rank is flowing to money-making pages, and you're using a generic anchor text. Google is likely not going to consider those pages to rank for "read review" or any of the words therein.

They probably don't take generic anchors into account when determining relevancy at all, but you still get the benefit of the page rank flowing. And maybe it gives you a buffer in the case of hitting those pages with an external exact match anchor. I think you're okay here, though. These links sound like they're useful links for visitors looking to navigate to product-specific reviews. It's a good thing.
 