Why does Google favor big sites, well-known brands and large publishers over experts and niche sites?


The narrative currently circulating in SEO is that Google favors large publishers over small ones and well-known brands over smaller service providers. I'm not saying this is false; we do observe such trends. The question, however, is why it happens. A corporate conspiracy? Let me speculate and propose a few theses. The basic one is that Google, in principle, operates justly. I understand this is an unpopular opinion, but so be it.

Please treat this article as an essay or feuilleton. I am addressing it to publishers, smaller producers, resellers, and SEO specialists.

To avoid misunderstandings:

An essay is, generally, a piece of writing that gives the author’s own argument, but the definition is vague, overlapping with those of a letter, a paper, an article, a pamphlet, and a short story. Essays have been sub-classified as formal and informal: formal essays are characterized by “serious purpose, dignity, logical organization, length,” whereas the informal essay is characterized by “the personal element (self-revelation, individual tastes and experiences, confidential manner), humor, graceful style, rambling structure, unconventionality or novelty of theme,” etc.

https://en.wikipedia.org/wiki/Essay

Expertise? Who needs it?

First, Google cannot assess content in terms of expertise (thematic coverage does not guarantee it). Second, one should ask why Google should prioritize expertise at all when most users cannot read (sic!) and are looking for easy-to-digest content that gives them straightforward information without going into details.

Publishers, content creators, journalists, SEOs, and marketers live in a bubble of educated people who seek information, conduct research, and work with data. Most people do not operate this way! Assuming that a highly detailed article explaining nuances, citing experts, and using professional language is better for the user (statistically) is a cognitive bias. We transfer observations from our environment to the entire population of internet users.

Incidentally, in other situations, we easily emphasize that most people are idiots and we are exceptional, but that’s another story…

Expert vs. Expert

If this still doesn't convince anyone, I'll give another argument. Recall the debates between specialists on current-affairs programs, arguing over economic, social, political, demographic, or ecological issues… Each has an academic title or journalistic achievements, which in our internet content-marketing culture makes them an expert. Yet they can present conflicting opinions. So which of them should Google favor? Who gets to decide that? Google's algorithms don't have the apparatus to determine which content is expert. That's all on the subject.

Valuable content, what is it?

We are observing a boom in AI-generated content. Sometimes it's better, sometimes worse, but the fact remains that a huge amount of it is being created.

Search engine algorithms include mechanisms to assess content quality. However, they primarily aim to filter out the worst, nonsensical, unreadable content. They are not mechanisms for identifying the best content.

In recent years, better-optimized content has won out. But if everyone uses tools like Surfer or NeuronWriter (grab the lifetime deal), saturates their content with the same phrases, builds optimal subheadings, does all the reverse engineering, and adds entities… then Google has no content-level signal left to decide which text is better. There are still differentiating factors, but they are not strictly content-related; I'll get to them shortly.
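A toy sketch of that claim (mine, purely illustrative – not anything Google publishes): once two competing articles are optimized against the same phrase list, their term profiles converge, and content alone can no longer separate them.

```python
# Toy illustration: two hypothetical articles "optimized" for the same
# phrase list end up with statistically indistinguishable term vectors,
# so ranking has to lean on non-content factors.
from collections import Counter
from math import sqrt

def term_vector(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

article_a = "best running shoes 2024 cushioning stability running shoes review"
article_b = "best running shoes 2024 review stability cushioning running shoes"

print(f"cosine similarity: {cosine(term_vector(article_a), term_vector(article_b)):.2f}")
# -> 1.00: identical term profiles; the texts are interchangeable to a phrase-based model.
```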

So why should content from your polished blog or small magazine written by enthusiasts (experts, even) be considered better by Google? What mechanism should be responsible for that? And would it indeed be better for users, most of whom are not interested in your musings and nuances? Or just for you?

And…

To be honest, there are many cases in SEO groups where publishers complain that their content is losing positions and that Google is taking away their traffic. The truth is, they are not publishing expert content. They think that running their text through one or two optimization tools makes it valuable. They believe that hiring a copywriter instead of using AI makes it expert-level. In reality, these are often poor-quality articles overloaded with ads. It's no wonder they lose visibility. Sometimes it's worth reflecting on this.

Links

When tools for scaling content appeared, some specialists predicted the end of SEO because "AI will overwhelm us." I never fully understood this argument; a few colleagues and I quickly concluded that the inflation of content would simply increase the value of other factors.

Currently, the only authority-building factor that scales (i.e., that lets you differentiate yourself from competitors) is links. I have no doubts about that. Of course, for links to work, a series of other criteria related to site trust, topical authority, and positive user reception must also be met (more on that shortly).

I hear complaints from some SEOs that they can't fight for top positions with links. I hear voices saying it's not worth it. But that doesn't mean links don't work; it means you're buying links that are too weak, and your project doesn't have the budget for better, usually more expensive, ones.

That’s why large brands grow with algorithm updates. That’s why big publishers win over small ones. They have a better link profile, more backlinks from seed pages, a larger scale of influence, and greater credibility. Large publishers themselves become seed pages.
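To see why backlinks from seed pages compound like this, here is a minimal PageRank sketch – the public algorithm from the original Google paper on a hypothetical toy graph, not Google's current system:

```python
# Minimal PageRank by power iteration. The node that everyone links to,
# including the seed, accumulates authority a niche blog cannot match.
DAMPING = 0.85
ITERATIONS = 50

# Hypothetical link graph: who links to whom.
links = {
    "big_publisher": ["niche_blog", "seed_portal"],
    "seed_portal":   ["big_publisher"],
    "news_site":     ["big_publisher", "seed_portal"],
    "niche_blog":    ["big_publisher"],
}

pages = list(links)
rank = {p: 1 / len(pages) for p in pages}

for _ in range(ITERATIONS):
    new_rank = {p: (1 - DAMPING) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = rank[page] / len(outlinks)  # each page splits its rank among its outlinks
        for target in outlinks:
            new_rank[target] += DAMPING * share
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page:14s} {score:.3f}")
# big_publisher comes out on top: every node links to it, including the seed.
```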

Why doesn't Google do anything about this? Because it's "safer" to refer users to information from a large outlet that is constantly monitored by other media, commentators, etc., than to a niche writer whose expertise is hard to verify.

I don’t want to get into a discussion about the state of the media, but in the context of the current level of information consumption, this solution seems sensible and safe from Google’s perspective.

When reading about a topic, users can choose between top outlets A and B, but they know both. If they had to choose a text by Jan Kowalski from "Jan Kowalski's blog about {anything}," there's a high risk the reader would think, "what nonsense is Google giving me here?" Google's goal is to deliver more than decent user experiences – satisfying sessions. That's it. Their strategy is logical.

Behavioral signals

Earlier, I mentioned behavioral signals, i.e., factors related to user behavior. This is not a simple topic, especially in terms of measurement, but it must be addressed. Google has long been able to assess user satisfaction with specific sites.

If Google sees that in a given topic site X delivers satisfying content (compared to others), it assigns it a trust credit. It predicts that this site, in this topic, is worth displaying high. Smaller sites, with fewer behavioral signals or with negative ones (e.g., from laypeople who leave an expert blog en masse because they don't understand it), simply lose out.
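How might sample size alone decide this? A toy sketch (my speculation, not a documented Google mechanism): if sites were scored by the lower confidence bound of their "satisfied session" rate, a small site would lose even with a better raw rate.

```python
# Hypothetical scoring: rank by the lower bound of the 95% Wilson score
# interval for the proportion of satisfied sessions. Fewer measurements
# mean a wider interval, so small sites are penalized by uncertainty.
from math import sqrt

def wilson_lower_bound(positive: int, total: int, z: float = 1.96) -> float:
    if total == 0:
        return 0.0
    p = positive / total
    denom = 1 + z * z / total
    center = p + z * z / (2 * total)
    margin = z * sqrt(p * (1 - p) / total + z * z / (4 * total * total))
    return (center - margin) / denom

# Made-up numbers: satisfied sessions / total sessions.
print(f"niche blog:    {wilson_lower_bound(90, 100):.3f}")        # 90% rate, 100 sessions  -> ~0.826
print(f"big publisher: {wilson_lower_bound(85_000, 100_000):.3f}") # 85% rate, 100k sessions -> ~0.848
# The big publisher wins despite a worse raw rate: more data, tighter bound.
```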

If we focus on the short-term interest of Google’s user, it’s good that they lose, regardless of whether we like it personally or not. Often, I personally don’t like it either, but so what? 🙂 We put moral and philosophical considerations aside at this point. Let’s be professionals.

Brand is an asset

Market research regularly shows that people generally prefer well-known brands. Why should it be different with Google? The search engine also collects signals that can confirm this. The websites of big brands have (to simplify) more good links, more content, and probably better user signals. Their sites are clicked more often, often have better UX, and are a preferred choice. They also build more touchpoints that legitimize them (ads, presence in various channels) and build habits, so users choose them regularly (including via direct traffic).

Drivers are more likely to choose well-known gas stations located right off the highway, where they have a points card, a car wash, and where they can buy a hot dog, rather than unknown stations 2 km away. It’s very similar with websites.

Therefore, Google makes a statistically justified (and user-beneficial) decision when it points its user to a well-known outlet or a large online store with a wide range and refined UX.

It has long been known that, in the long run, all markets tend toward consolidation, naturally producing a few leaders while barriers to entry rise. The "internet" market is no exception. These changes have simply accelerated, and it is becoming increasingly difficult for small players to compete for top positions. All the mechanisms ("SEO tricks") that allow one to fight for traffic from Google are implemented quite efficiently by big businesses with capital.

Helpful Content System and hidden gems?

Google is aware that some users seek more expert content, deeper analyses, more detailed advice, etc. It signaled this by implementing the Helpful Content System and by making changes in core updates that are supposed to eventually surface so-called hidden gems. If you are a specialist or a demanding user, there may be hope that more personalized results will bring you something.

A separate mechanism, bypassing the main factors (e.g., a strong link profile), should be responsible for this. However, as we can see, at the moment it doesn't work very well, which only underscores that focusing on links makes sense. I understand the frustration of those who pour sweat and tears into creating so-called valuable content that is ultimately not rewarded by the algorithm. I really do. But looking at the whole picture statistically, it has to be this way. Perhaps the mass-communication channel is simply not the right one for you? Premium publishing "brands" and expert brands either have to cooperate with the giants or find other channels.

Including more niche ones, based on loyalty and exclusivity, such as closed groups or newsletters. Many experts have successfully been operating this way for years. Maybe it’s your time too?

Future of SGE

The truth is that we will find out soon enough what the future looks like with the Search Generative Experience. Google will further encourage zero-click searches by replacing direct answers from publishers' websites with its own AI-generated content. The question is how long this can last if publishers decide to block AI bots from learning from their content. I don't have the answer, but one thing is certain: for AI to remain relevant and up to date, it needs access to current content. And publishers need an incentive to publish it.

I recently spoke with a director from one of Poland’s largest portals (actually the entire media group). He said he sees no reason to allow OpenAI or Google to learn from their content, so they will block it. This deadlock cannot last forever, so I think we will see many twists and turns regarding AI results in SERPs. However, I am still convinced that larger media outlets will have an advantage over smaller ones – for many reasons mentioned in this article.
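Technically, the blocking he describes is a robots.txt matter. GPTBot (OpenAI's training crawler) and Google-Extended (Google's opt-out token for AI training) are both publicly documented, and blocking them does not touch regular Googlebot indexing. A minimal sketch of the effect, using Python's standard robots.txt parser:

```python
# Sketch: a robots.txt that refuses the documented AI-training crawlers
# (GPTBot, Google-Extended) while leaving ordinary search crawling alone.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

for bot in ("GPTBot", "Google-Extended", "Googlebot"):
    print(f"{bot:16s} may fetch /article: {parser.can_fetch(bot, '/article')}")
# GPTBot and Google-Extended are refused; Googlebot still crawls normally.
```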

Summary

  • If you run a business (a producer, an online store, etc.), keep in mind the growing capital barriers to entry, not just "bad Google algorithms."
  • If you represent a publisher – act like the big ones, sell out, join a publishing group, cooperate with a big brand.
  • If you are an SEO specialist – aim for larger clients because that’s where the capital is moving, even in the SEO industry.

And a quick reminder for those who claimed links were losing significance, especially in the context of the AI revolution: it is precisely the AI revolution that has made strong links an even more crucial element of building authority.
