AMP articles are getting a lot of traction these days. Bottom line for me: Google is trying to confiscate the Web with AMP, exactly as Facebook is doing with Instant Articles.
A controversial view:
The reality is more complex—it turns out to be really difficult to build a non-trivial site that works in the worst cases (low-end device and/or poorly connected device) but that also enhances up to a full, rich experience that meets consumer expectations. It’s simply easier to deliver a different experience to each device class. Progressive enhancement (PE) is useful and is a good thing, but its powers are tricky to harness in reality at a wholesale level. This is a controversial thing to say, but if PE really was useful for wholesale adaptation, you would see more of it in use. To take one example, the search page served by Google would seem like an ideal candidate for PE techniques. But does Google use it? No, not really—they serve entirely different markup to each class of device. Why don’t AMP pages and Instant Articles enhance up to their full-featured equivalents? It’s just too hard to make it work.
Yes, true. Building scalable, progressively enhanced pages is hard. Too hard? I don’t know; your mileage may vary, but I see what the author means. Yet should we just let Google and Facebook do the job, filter browsers, and serve users optimised versions of content? From what I hear here and there, only a fraction of the people who read these versions ever visit the site that actually produced the content. In practice, readers are cut off from the original source: they must hunt around to find the URL the content was drawn from.
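For readers unfamiliar with the technique, the “enhance up” idea in the quote above can be sketched as simple capability detection: the HTML works on its own, and script only layers richer behaviour on top when the environment supports it. This is a minimal, hypothetical sketch (the capability names and tiers are mine, not from any real framework):

```javascript
// Progressive enhancement sketch (hypothetical tiers): the page's plain
// HTML already works with no script at all; we only upgrade the experience
// when the capabilities we detected in the browser allow it.
function pickExperience(env) {
  if (env.fetch && env.intersectionObserver) {
    return "rich";     // e.g. lazy-loaded images, in-page navigation
  }
  if (env.fetch) {
    return "enhanced"; // partial upgrades only
  }
  return "baseline";   // plain links and full page loads still work
}

console.log(pickExperience({ fetch: true, intersectionObserver: true }));
console.log(pickExperience({ fetch: false, intersectionObserver: false }));
```

The hard part the quoted author is pointing at is not this branching itself, but keeping every tier genuinely usable across a whole non-trivial site.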
Why should users care, you ask? Well, to be honest, they shouldn’t. But we as web people should. It’s already bad enough that URL shorteners have broken such a basic thing as links.
Some will fret about splitting the web and say that we have regressed, but on the other hand we now have some really fast mobile sites that reach more devices and lower-end devices than ever before. Could we have done this without AMP or Instant Articles? Yes, of course. But we wouldn’t have—and despite swathes of evidence pointing to the importance of page speed—we didn’t. Instead we got relentlessly heavier and slower.
So, because our community as a whole failed at this, should we hand everything over to the giants and go back to bed? Fortunately, the same author, on the same site, also sings a different tune entirely:
So why the sense of foreboding? The issue is not the labelling of fast-loading pages. Rather, it is the approach of favouring a Google-approved method of achieving a fast-loading page. Other pages may load just as quickly but if they’re not built with AMP they won’t be called out. Lean sites that jump through hoops to maximise performance but that don’t use AMP will not get the preferential treatment. It would be far more equitable if reliably fast-loading sites were called out regardless of their underlying technology.
YES, DEFINITELY YES. Google is using its monopoly position to build a walled garden out of content created by others, earning money from other people’s work in a caricature of crowdsourced content.
Maybe it’s time to remind Facebook and Google that AOL played the same kind of game twenty years ago, pretending the Web didn’t exist. And look where AOL’s pages are now: they went the way of the dodo. Interoperability is not just a word.
Also, from the W3C’s Technical Architecture Group:
While we understand the value these approaches provide, they also pose serious issues. Fundamentally, we think that it’s crucial to the web ecosystem for you to understand where content comes from and for the browser to protect you from harm. We are seriously concerned about publication strategies that undermine them.
Please read AMPersand by Ethan Marcotte and Google’s AMP HTML by Adrian Roselli, who are more vocal and articulate than I am in showing why this AMP idea is flawed.