Indexing your PWA (Discoverability & SEO) – Progressive Web App Training

Every search engine has a different way of ranking pages, but they all depend on a web crawler to gather information. When you build a JavaScript-driven site, the crawler might not be able to find everything, so you might need to give it a little help.

While every search engine has its own way of crawling, two rules are fairly universal: if the crawler can’t see it, it’s not going to be indexed, and everything needs its own URL. There may be a trivial solution for your site: if customers always search for a landing page or other static content, simply make sure those pages are static content. This won’t index client-rendered content, but that may be exactly what you want.

This does raise an interesting distinction: a PWA does not have to be a single-page app. You can add a service worker to every page of a multi-page website, and as long as those pages have the same origin and scope, they will share one service worker. Another option is to render the dynamic content on the server and then let the client take over rendering. This lets any crawler see and index all of your content.
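
As a minimal sketch of that first option, every page of a multi-page site can register the same shared worker (the `/sw.js` path and root scope here are assumptions for illustration):

```js
// Run on every page of the site: register one shared service worker.
// Pages with the same origin and scope all share this single worker.
if ('serviceWorker' in navigator) {
  navigator.serviceWorker
    .register('/sw.js', { scope: '/' }) // '/sw.js' is a hypothetical path
    .then((reg) => console.log('Service worker scope:', reg.scope))
    .catch((err) => console.error('Service worker registration failed:', err));
}
```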

You can use these solutions with any crawler, since there’s no JavaScript involved: if you want your app to be indexed everywhere, you’ll have to render it on the server. You can write code that renders on the client or on the server; this is called isomorphic JavaScript, but it assumes you’re using Node or another JavaScript server. And if you want an easy test, you can run Lighthouse.
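
As a hedged sketch of what isomorphic rendering can look like (Express and the shared `renderApp()` function are assumptions for illustration, not part of the training):

```js
// server.js — a minimal sketch of server-side rendering with Express.
// renderApp() is a hypothetical function shared with the client bundle,
// so the same code renders on both sides (isomorphic JavaScript).
const express = require('express');
const { renderApp } = require('./app'); // same code the client uses

const app = express();

app.get('*', (req, res) => {
  // Render the page to HTML on the server so any crawler can index it.
  const html = renderApp(req.path);
  res.send(`<!DOCTYPE html>
<html>
  <head><title>My PWA</title></head>
  <body>
    <div id="root">${html}</div>
    <!-- The client bundle hydrates the page and takes over rendering. -->
    <script src="/client.js"></script>
  </body>
</html>`);
});

app.listen(3000);
```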

Lighthouse includes some basic SEO and discoverability tests, run as if by an HTML-only crawler, and each test has instructions for fixing or improving shortcomings. Okay, so the universal answer is not to depend on JavaScript. Google’s crawler, however, can run JavaScript, so it can index client-rendered sites as long as you follow some rules. There are about a dozen rules, but the top five will take you most of the way, and we’ve already covered the first one.

The first rule: make your content crawlable. That means rendering it so the crawler can find it. If you’re writing a single-page app, the top five rules become these top five tips. Many developers provide navigation links with a hash for the URL and use a click listener to trigger changes; instead, those links should point to actual paths in your app. You also need to avoid URL fragments (the part of a URL that begins with a hash sign); they break many tools and libraries and are now deprecated.
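
For example, a single-page app can keep real paths in its links and still avoid full page reloads by intercepting clicks and using the History API (a sketch; the `render()` routing function is a hypothetical placeholder):

```js
// Links point at real paths the crawler can follow, e.g. <a href="/products">.
// A click listener intercepts them and updates the URL without a reload.
document.addEventListener('click', (event) => {
  const link = event.target.closest('a');
  if (!link || link.origin !== location.origin) return; // let external links through
  event.preventDefault();
  history.pushState({}, '', link.pathname); // a real path, no '#' fragment
  render(link.pathname); // hypothetical client-side router
});

// Handle the back and forward buttons the same way.
window.addEventListener('popstate', () => render(location.pathname));
```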

We used to recommend hash-bang (#!) prefixes for crawling JavaScript-powered sites, as a way to change URLs without reloading the page, but now you should use the History API instead. The next rule is to use canonical URLs for duplicate content. For example, sites with AMP pages normally have a server-rendered page and a client-rendered AMP page; the client-rendered page links back to the server-rendered page using the rel="canonical" attribute.
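
The canonical link itself is a single tag in the head of the duplicate page (the URL here is a placeholder):

```html
<!-- In the <head> of the client-rendered duplicate page -->
<link rel="canonical" href="https://example.com/products/42">
```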

The crawler will index the canonical server-rendered page. Some developers even shadow their client-rendered pages with server-rendered pages and use the canonical link to point back to the server; this makes more of the app discoverable. Tip number four also gives you great accessibility: use native HTML elements whenever possible. Crawlers know what to do with an actual button, but won’t recognize a div of class “button” in the same way. Finally, use progressive enhancement, with polyfills where it makes sense, to support older browsers.
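
To illustrate the difference (the `buyNow()` handler is hypothetical):

```html
<!-- Crawlers and assistive technology understand a real button: -->
<button type="submit">Buy now</button>

<!-- ...but not a styled div, which needs extra scripting and ARIA to work: -->
<div class="button" onclick="buyNow()">Buy now</div>
```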

You never know which version of a browser a particular crawler uses, so play it safe. Some simple changes can improve your data quality and give users much better results. One is to use schema.org annotations for structured data. There are predefined schemas for common areas such as e-commerce, scheduling, and job postings, and search engines can use the schema annotations to parse your data accurately.
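
A minimal sketch of a schema.org annotation using JSON-LD (the product name and price are made-up values):

```html
<!-- JSON-LD structured data in the page's <head> -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD"
  }
}
</script>
```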

The same logic applies to the Open Graph protocol, which allows any web page to become a rich object in a social graph. Finally, Twitter cards provide a rich media card that is displayed when anyone links to your site from Twitter. It’s important to test your work and to work iteratively, so you can see the effects of each change. Testing on multiple browsers is not only a best practice for everyday development.
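
As an illustration, Open Graph and Twitter card annotations are just meta tags in the page’s head (all values here are placeholders):

```html
<!-- Open Graph tags -->
<meta property="og:title" content="Example Widget">
<meta property="og:type" content="website">
<meta property="og:url" content="https://example.com/products/42">
<meta property="og:image" content="https://example.com/widget.png">

<!-- Twitter card tags -->
<meta name="twitter:card" content="summary_large_image">
<meta name="twitter:title" content="Example Widget">
<meta name="twitter:image" content="https://example.com/widget.png">
```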

It also ensures your site renders correctly for multiple crawlers. Testing with the Google Search Console will crawl your site and show you the results, and you should always pay attention to loading performance. Use tools such as PageSpeed Insights or WebPageTest to measure the loading performance of your site; remember, about 40% of consumers will leave a page that takes longer than 3 seconds to load.

Of course, the most important rule is to treat client-side rendering as a progressive enhancement. If you test on a range of browsers, you’re probably fine; if you want to be certain, you can use the Fetch as Google tool in the Search Console. If that went by a little fast, see the Google Webmaster Central blog for the details on how to make your PWA search-ready. Then come back here and I’ll tell you how to measure user engagement in your PWAs. Thanks for reading.

