The underappreciated reason Google is useless
You don’t need me to tell you that search engines are struggling to be relevant. Often the blame falls on search engine optimization and Google’s business practices. Both contribute, no doubt. But Google — and the rest of us, actually — depended heavily on the volunteer attitude of the young web and its dedication to open access.
Once upon a time (and maybe still today, I am no expert), link quality was a critical part of Google’s PageRank algorithm. If your site did not have a lot of links to it, then you didn’t get high rankings. The thinking was that more links meant higher reliability, and that was effective for a little while.
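The link-counting idea can be sketched in a few lines. This is a toy version of the power-iteration approach described in the original PageRank paper, not Google's actual implementation; the example link graph, damping factor, and function names here are illustrative assumptions.

```python
# Toy sketch of link-based ranking: each page distributes its score
# to the pages it links to, so pages with many (well-ranked) incoming
# links end up with higher scores.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with a uniform score
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:
                # A dead-end page spreads its score evenly to everyone.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical three-page web: two sites link to "hub", so it ranks highest.
web = {
    "blog": ["hub"],
    "fan-site": ["hub"],
    "hub": ["blog"],
}
scores = pagerank(web)
```

The manipulation problem falls straight out of the code: add a thousand spam pages that all link to your site, and your score climbs even though nothing about your content changed.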
If what you measure has economic value, the measurements will be manipulated. This is always true, whether it’s work hours, tax deductions, or employee performance. Judging a site’s worth by its incoming links worked for a time, but once people realized backlinks were valuable, you started getting a lot of link farms and spam-bots building links to low-quality content.
SEO folks, both white-hat and black-hat, focused on doing things to the content that boosted PageRank score without necessarily creating quality content. In response, Google kept coming up with more elaborate ways to sidestep the manipulation.
Around 2007 though, as social media sites like Facebook and Twitter became popular, people who used to make useful, entertaining, or quirky web sites started moving more and more of their work into these platforms. The economic and social benefit of having your own web site declined. The value now was being Twitter, Facebook, or YouTube famous. The DIY spirit of the web was largely captured, centralized, and redirected to the economic benefit of just a handful of places.
I am as guilty of making this shift as just about anyone else.
So as garbage generation was accelerating, authentic creation was captured and monetized by an ever-smaller number of platforms. Not just Twitter and Facebook, either, but places like Reddit and Stack Overflow. Independent publishers weren’t just crowded out of PageRank by SEO practitioners; people simply stopped making things outside these platforms.
Even some of the places that used to host quality content ended up getting sold to or commandeered by operators far more interested in advertising revenue than in the material that drew people to the site in the first place. Sites that remained content-focused started hiding more and more behind paywalls and registration walls. The Verge has recently opted for the former and 404 Media the latter.
The stuff search engines are looking for? Increasingly, it’s just not there to find.
And now many of us who remain publishing independently are blocking crawlers in an attempt to prevent our work being used to train the artificial intelligences other humans are using to make automatic garbage content at an unimaginable rate.
Where does that leave us?
Despite it all, I think it’s important for people who have the inclination and ability to start making their own content again. Maybe that content is a blog like this one. Maybe it’s a fan site for an obscure Australian 70s children’s show. Maybe it’s hand-curating links.
The important bit is to stop posting that stuff on the Facebook machine or Bluesky, and absolutely stop having anything to do with X née Twitter.
The web is still open. Anyone who wants to learn how can publish on it, for free or very close to it. We should take advantage of it. I don’t think there will ever be a circumstance again where most of the material on the web is being made by people who care about what they are doing more than they care about monetizing traffic.
But we can keep at least part of the web weird. Let the bots go talk to each other.