Categories
Online Marketing

Service Workers – The State of the Web

My guest is Jeff Posnick. He's on Google's developer relations team, and today we're talking about service workers and how they're elevating the capabilities of progressive web apps. Let's get started. All right, so Jeff, thanks for being here. In the context of web technologies, what does it mean for something to be a worker, and what does it actually do? So, the whole idea of a worker has been around for a while.

Traditionally there were web workers, and a web worker basically serves as almost like a background thread for the web. A worker can execute JavaScript code that's independent from the context of your actual web page, and it's a great way to offload processing, or to do tasks that might take a certain amount of time, without slowing down the main thread for your web page. That's been the traditional model for workers on the web.

So now, what does a service worker mean? What does that actually do? The service worker builds on that concept and adds some superpowers, really, things that you were not able to do before. A service worker is similar to a worker in that it's running independent of your actual web page, and it doesn't have access to things like the DOM or the global scope of your web page. But unlike other workers, it can respond to specific events, and some of those events relate to network traffic.

So one of the really cool things, and the most common use case for a service worker, is to respond to outgoing network requests that your web page might be making. The service worker can sit in between your web page and the network and almost serve as a proxy that you control. You can write code that takes advantage of things like the Cache Storage API and say, hey, I know how to respond to this particular request without having to go to the network.

I can just use this cached response, thereby avoiding the uncertainty and unreliability that comes with going against the network. It also enables capabilities like push notifications. Yeah, so there's a whole bunch of event-based listeners that you can set up in the service worker, including responding to push notifications that may come from a notification server, responding to fetch requests, and a few other interesting things that are slated for the future as well.
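The interception-and-cache idea described above can be sketched in a few lines. This is a minimal illustration rather than production code; the cache name static-v1 and the list of cacheable extensions are assumptions, and the code would live in the service worker script itself:

```javascript
// Sketch of a cache-first fetch handler in a service worker file.
// Pure helper: only treat static assets as cacheable in this sketch.
function isCacheableAsset(url) {
  return /\.(css|js|png|jpg|svg|woff2?)$/.test(new URL(url).pathname);
}

// The self and caches globals only exist in a service worker context,
// so guard before wiring up the listener.
if (typeof self !== 'undefined' && typeof caches !== 'undefined') {
  self.addEventListener('fetch', (event) => {
    if (event.request.method !== 'GET' || !isCacheableAsset(event.request.url)) {
      return; // let the browser go to the network as usual
    }
    event.respondWith(
      caches.match(event.request).then((cached) => {
        // Serve the cached copy when we have one; otherwise fetch it
        // and stash a copy for next time.
        return (
          cached ||
          fetch(event.request).then((response) => {
            const copy = response.clone();
            caches.open('static-v1').then((cache) => cache.put(event.request, copy));
            return response;
          })
        );
      })
    );
  });
}
```

The decision of which requests to handle (here, a simple extension check) is exactly the per-request strategy Jeff recommends thinking through before shipping a service worker.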

So what's the status of its implementation and support? Service workers are well supported right now in modern browsers: pretty much anything Chrome or Chromium-based, Firefox, Safari, and Edge at the moment, which is great. They all have at least a basic level of support for service workers and some of the enabling technologies, like the Cache Storage API, so they're ready to use right now.

So websites may experience network reliability issues at any given time. Would you recommend service workers for every website? Should they all be using one? Well, it's tempting to just throw a service worker up and see what happens, but I would suggest taking a little bit more of a considered approach before adding a service worker to your web app. Ideally, a service worker will play the same role that your web server would play, and maybe share the same logic for doing the routing and templating that your web server would normally respond with.

And if you have a setup where your web server, as for a lot of single-page apps, can just respond with some static HTML that could be used to satisfy any sort of request, that's pretty easy to map onto service worker behavior. We call that the app shell model: the service worker can say, hey, you're navigating to XYZ URL, I can just respond with this HTML and it'll always work.

So that's a really good model for using a service worker if you have a single-page app. We're also seeing some success with partners who use models where their server is implemented in JavaScript: they have some routing logic and some templating logic in JavaScript, and that translates over really well to the service worker too, where the service worker basically fills the role that the server would normally play.
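The app shell model Jeff describes can be sketched roughly as follows; the shell URL, the cache name, and the overall shape are assumptions for illustration, not the only way to do it:

```javascript
// Rough sketch of the app shell model: every navigation gets the same
// pre-cached shell HTML, and client-side routing fills in the content.
const SHELL_URL = '/shell.html';   // assumption: one static shell page
const SHELL_CACHE = 'app-shell-v1';

// Pure helper: navigations are requests with mode === 'navigate'.
function isNavigationRequest(request) {
  return request.mode === 'navigate';
}

if (typeof self !== 'undefined' && typeof caches !== 'undefined') {
  // Cache the shell at install time so it is always available offline.
  self.addEventListener('install', (event) => {
    event.waitUntil(
      caches.open(SHELL_CACHE).then((cache) => cache.add(SHELL_URL))
    );
  });

  // Answer every navigation with the shell, without touching the network.
  self.addEventListener('fetch', (event) => {
    if (isNavigationRequest(event.request)) {
      event.respondWith(caches.match(SHELL_URL));
    }
  });
}
```

Because the same HTML answers every navigation, this works best for single-page apps, which is exactly the case described above.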

I would say, if you have a scenario where your back-end web server is doing a whole bunch of complex templating and remote API calls in a language that is not JavaScript, it really might be hard to get your service worker to behave exactly the same way. In those scenarios you can still add a service worker, and we have some provisions in place so you don't pay the price of having that service worker intercepting all requests, doing nothing with them, and just going against the network.

There are ways of saying, hey, we have a service worker, but we're not going to be able to respond with HTML for navigation requests. In those scenarios it is still possible to use the service worker for things like showing a custom offline page when you detect that a user's network connection is down, or implementing an interesting caching strategy, like stale-while-revalidate, for certain types of resources.
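The stale-while-revalidate strategy mentioned here answers from the cache immediately when it can, while refreshing the cached copy in the background. A minimal sketch, with the cache name swr-v1 as an assumption, might look like this:

```javascript
// Sketch of stale-while-revalidate: return the cached response if one
// exists, and refresh the cached copy from the network either way.
async function staleWhileRevalidate(cache, request, fetcher) {
  const cached = await cache.match(request);
  const networkPromise = fetcher(request).then((response) => {
    cache.put(request, response.clone());
    return response;
  });
  // Cached copy wins if present; otherwise wait for the network.
  return cached || networkPromise;
}

if (typeof self !== 'undefined' && typeof caches !== 'undefined') {
  self.addEventListener('fetch', (event) => {
    event.respondWith(
      caches.open('swr-v1').then((cache) =>
        staleWhileRevalidate(cache, event.request, fetch)
      )
    );
  });
}
```

The trade-off is the one discussed later in the interview: users may briefly see stale content, but they get it instantly, and the next visit picks up the refreshed copy.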

So it is still possible to add a service worker in those cases, but you won't necessarily get the same performance and reliability benefits that you get when your service worker really responds to all navigations with HTML. By essentially having a network proxy juggling requests and responses, is there a latency cost to having a service worker? Yeah, so you're running JavaScript code that's sitting in between your web app and the network, and that's not free.

Some of it depends upon whether the service worker is already running. One of the neat features of a service worker, partly to preserve battery on mobile devices, is that it's killed pretty aggressively; it doesn't just keep running forever in the background. So sometimes you do have to start up the service worker again, and there is a cost involved in that startup. There's a really good talk from the Chrome Dev Summit that just happened a couple of months ago that goes into some metrics and real-world performance.

It covers timings of exactly how long it takes to start up a service worker, seeing tens to hundreds of milliseconds depending upon the actual device and things like the storage speed of the device. So you are potentially going to be paying that cost when you're using a service worker, and again, that's really why it's important to make sure that you have a strategy in place for responding to requests, hopefully by avoiding the network and just going against the Cache Storage API.

Ideally, if you're doing that, then you should see the service worker give you a net positive in terms of performance. Paying tens, maybe even hundreds of milliseconds is nothing compared to the multiple seconds of latency that you might expect from making a network request each time you navigate to a new URL. Right, what's the saying? The fastest request is the one that you never need to make. Indeed, yeah.

So what are some anti-patterns that you've seen in the way people have implemented service workers? There's a lot of power involved in using a service worker. It is just JavaScript that you write, and it will pretty much do whatever you want, so you can do all sorts of crazy things, some of which are kind of cool as proofs of concept, but not necessarily things you want to deploy to production. There are a few things that we've seen as pain points, things that are unfortunately pretty easy to get wrong when implementing a service worker.

I think one of the most common is caching requests and responses as you go without having any sort of upper limit on the amount of data that you're storing. Imagine a website that has a bunch of different articles, and each of those articles has images. It's pretty easy to write a service worker that just intercepts all those requests and saves the responses in the cache, but those cached responses will never get cleaned up by default.

There's not really any provision in the Cache Storage API for saying, you know, stop when you reach 50 or 100 entries, or something like that. So you could very easily just keep using up space on your users' devices, and potentially use up space for things that are never going to be used again. If you have an article from a week ago and you're caching all the images in that article, that's kind of cool,

I guess, if the user is going to visit that article again soon. But if it's a page that users are never going to go to again, then you're really just caching things for no reason. I would say that one of the important things before you implement your service worker is to have a strategy for each type of request: here are the navigation requests that are being made for HTML, and here's how I'm going to respond to them.
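Since the Cache Storage API has no built-in size limit, a manual trim is one way to put a cap in place. A rough sketch (the function names and the eviction rule are assumptions; Workbox, discussed below, ships a more complete version of this):

```javascript
// Pure helper: how many entries must go to get back under the limit.
function entriesToEvict(currentCount, maxEntries) {
  return Math.max(0, currentCount - maxEntries);
}

// Sketch: delete the oldest entries once a cache grows past maxEntries.
// cache.keys() returns entries roughly in insertion order, so deleting
// from the front approximates "oldest first" (not true LRU).
async function trimCache(cacheName, maxEntries) {
  const cache = await caches.open(cacheName);
  const keys = await cache.keys();
  const evictCount = entriesToEvict(keys.length, maxEntries);
  await Promise.all(keys.slice(0, evictCount).map((key) => cache.delete(key)));
}
```

Calling something like trimCache('articles', 50) periodically, for example after each successful cache write, keeps the unbounded-growth problem described above in check.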

For the image requests I'm making, maybe it doesn't make sense to cache them at all, or maybe I'll only cache certain images and not others. So think about that, and that really just means getting comfortable with the network panel in the browser's dev tools and seeing the full list of requests being made. Sometimes your web app is making requests

and you don't even realize it's happening, because it's coming from third-party code, and your service worker ends up seeing that too. So you want to make sure that you know what your service worker is doing and what your web app is doing. One other common pain point, something that can go wrong when using a service worker, has to do with controlling updates to resources.

You are stepping in between your web app and the web server, and potentially responding with cached resources. If you're not sure that those cached resources are being updated every time you make changes to your actual website and redeploy to your web server, it's possible that your users will end up seeing stale content kind of indefinitely. And this is a trade-off: seeing stale content while avoiding the network gives you performance benefits.

So that's good for a lot of scenarios, but you do need to have a provision in place for updating, and for making sure that maybe the user sees stale content now, but the next time they visit the site they get fresh content. You can get that right; unfortunately, you can also get that part wrong, and users can end up with a frustrating experience. So, you maintain a tool called Workbox JS.

What is that? What does it do? Sure. So Workbox is an open-source set of libraries for dealing with service workers and all aspects of building service workers. We have some tools that integrate with build processes: we have a webpack plugin, we have a command-line tool, we have a Node module. That aspect of the tools is basically something you can drop into your current build process to get a list of all of the assets that are being produced

every time you rebuild your site, along with some fingerprinting information, saying, you know, this is a particular version of your index.html. Workbox will keep track of that for you, and then it will efficiently cache all of those files that are being created by your build process. That just helps ensure that you don't run into scenarios like I just described, where you've rebuilt

your site and you never get updates to your previously cached resources. We also have some tools as part of Workbox that execute at runtime, as part of the service worker: libraries for doing common things like routing requests, and some canonical response strategies for dealing with caching, things like stale-while-revalidate or cache-first.

We have implementations of those strategies inside of Workbox, and then we have some value adds on top of what you get with the basic service worker spec and the Cache Storage spec. For instance, we have an implementation of a cache expiration policy that you can apply to the caches that would otherwise just grow indefinitely. Using Workbox you can say, hey, I'd actually like to stop

when I reach ten items, and purge the least recently used items from the cache when that happens. There are a few other runtime modules as well. We see it as a bit of a grab bag for all the things that somebody might want to do with a service worker, and we ship them as individual modules, so you can choose the ones that you think would be useful for your particular use case. If you don't want to use something, that's fine; you don't have to incur the cost of downloading it or anything like that.
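The kind of setup being described, a routed cache-first strategy with a ten-item expiration cap, looks roughly like the following in a service worker. The CDN URL, Workbox version, and route pattern are assumptions here, so check the current Workbox documentation before copying this:

```javascript
// Hypothetical sketch of Workbox runtime caching with expiration.
const MAX_IMAGE_ENTRIES = 10; // the ten-item cap mentioned above

// importScripts only exists inside a classic worker context.
if (typeof importScripts === 'function') {
  importScripts('https://storage.googleapis.com/workbox-cdn/releases/6.5.4/workbox-sw.js');

  // Cache-first for images, capped at MAX_IMAGE_ENTRIES; once the cap
  // is hit, the least recently used entry is purged.
  workbox.routing.registerRoute(
    ({ request }) => request.destination === 'image',
    new workbox.strategies.CacheFirst({
      cacheName: 'images',
      plugins: [
        new workbox.expiration.ExpirationPlugin({ maxEntries: MAX_IMAGE_ENTRIES }),
      ],
    })
  );
}
```

This is the "individual modules" point in practice: routing, strategies, and expiration are separate pieces, and you only load the ones you use.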

Do you foresee some of those caching and expiration policies making their way back into the Cache Storage API? Yeah, it's kind of interesting: whenever you have something that's almost like a polyfill for some behavior on the web, you wonder whether that ends up being implemented back into the standards, so that the actual runtime library could just fade away and use the underlying standards.

And I'd like to see that. I think that Workbox has been really great for enabling folks to ship service workers in production, and for surfacing the types of things that they actually need when shipping something in production. A lot of times, when you can do that, you can point to it as evidence: yeah, it actually is important to have runtime cache expiration.

That could then be used when going to different standards groups and saying, hey, we really do need to extend what's supported natively in the platform to take care of this really common use case. Whether that actually happens or not remains to be seen, but I think Workbox is positioned to help folks with that initial stage of proving that these things are necessary, and to take it from there.

So, in terms of adoption, according to the HTTP Archive, less than 1% of websites tested actually include a service worker, which is kind of a misleading number for two reasons. The first is that it's actually growing at a very fast rate, and the second is that the websites that do include one are actually pretty popular websites. Can you give us some examples of those? Yeah, so I think the raw number of unique URLs might be on the lower side, but in terms of traffic, sites as big as Google Search have deployed a service worker for some types of clients.

Partners that we've talked about using Workbox in the past include Starbucks, which has a nice progressive web app, and Pinterest as well. And there are also some sites that you might have heard of, like Facebook and Twitter, that are using service workers, not using Workbox, but using them to unlock things like their progressive web app experience, or in some cases just showing notifications, which is an important part of being on the web and having parity with native apps.

So I think that the actual number of visits to web pages with a service worker is probably much higher than the 1% number would indicate. And there are challenges with adding a service worker, especially into legacy sites. It does take that coordination that we talked about before, making sure that your service worker actually behaves in a similar way to how your web server would behave, and that doesn't always fit into existing sites.

A lot of times what we've seen when working with partners in particular is: if you're planning a rewrite or re-architecture of your site anyway, that's a great time to add a service worker in and take care of that story as well. Are there any options for CMS users who may be using things like WordPress or Drupal? There definitely are, and first of all, I'd refer everybody to another talk from the most recent Chrome Dev Summit

that really goes into some detail about the WordPress ecosystem in general. They have a really cool solution that some folks from the dev rel team at Google have been working on, and I think it works around the problem I was describing, where the architecture of your back-end web server needs to match up with the service worker implementation, by setting a common baseline.

So it's not an attempt to take any arbitrary WordPress site that might be out there, which might be executing random PHP code depending upon what themes and extensions and all the other stuff is going on; you're really not going to be able to successfully translate that into a general-purpose service worker. Instead, the approach described in this talk builds on top of a common baseline of using the AMP plugin as a starting point.

So it applies to any site that has gone through the effort of meeting all the requirements for using the AMP plugin. I don't know the full set of requirements, but I think it includes things like not running external scripts, and not doing anything too crazy with other plugins that insert random HTML into the page. Building on top of that, you can then have a service worker that says, OK, I actually do know how to handle this subset of activities that WordPress is doing when it's using the AMP plugin, and the solution can automatically generate that service worker for you.

So again, it's part of a migration story. It's not going to just drop into any existing legacy WordPress site, but it does give a nice path forward for folks who are planning on rewriting, or making some changes, anyway. And plugging into the CMS ecosystem is a great way to increase adoption significantly. Yeah, absolutely. So what kinds of resources would you recommend for someone who's just getting started with service workers? We have a lot of material available, some of which is more recent than the rest.

I would say that the thing I worked on most recently is the resilience section of web.dev. If you go there, we have something that will walk you through the various steps of thinking about adding a service worker to your website, or really just thinking about making your website more resilient in general. It'll talk about identifying your network traffic, it'll talk about using the browser's HTTP cache effectively, which is kind of your first line of defense, and then it'll go into how you can add Workbox to an existing site and the various steps involved there. So if you want a guided path, I would say that's one option, although I'm obviously biased

in favor of it. If you want to just learn more about service workers in general, the material written by my colleague Jake Archibald is probably the best for folks who really want to deep-dive on things. He was somebody who worked on the actual service worker specification, and he knows more than anybody else about these things. He has a really great article talking about the service worker lifecycle: all the different events that are fired, how you have to handle those events differently, and the implications they have for the state of your caches and updates and things like that. Diving into that would be my recommended starting point. And he has another article, almost a cookbook, with recipes for caching: implementations of the stale-while-revalidate

pattern and the cache-first pattern, if you wanted to implement them yourself instead of using Workbox; he walks through the process. Is that the Offline Cookbook? Yes, the Offline Cookbook. And if you want something that's really offline, there are some actual physical books that are pretty cool, related to service workers and progressive web apps in general. There's a new book written by Jason Grigsby in particular that I would recommend. It talks a little less about the technical aspects of service workers and more about why you should think about adding a service worker to your site, and why you might want to build a progressive web app in general. It's a really cool book that takes it from a slightly different angle, but gives some good perspective. Great, Jeff.

Thank you again for being here. Absolutely. You can find links to everything we talked about in the description below. Thanks a lot, and we'll see you next time.




Web Performance: SEO Mythbusting

Mythbusting with me today is Ada Rose Cannon, and you're working for Samsung, is that right? What do you do at Samsung? So, for Samsung, I'm a developer advocate for the web browser Samsung Internet. Samsung Internet is a web browser for Android phones.

You can get it from the Play Store, but not a lot of people have heard about it, so a lot of what I do is trying to raise awareness. But more importantly than that, what I'm trying to do is advocate for the web as a platform, to try and encourage developers to build for it and to make sure it lasts long into the future as a great and healthy platform for people to build stuff with. I love having you here, because I want to talk to you about SEO versus performance and usability on the web, and I think we need to get some stuff out of the way first.

So, would you say, what are the most important bits and pieces that you would like people to focus on more when building web stuff? So, I have a huge passion for ensuring that the web remains great for everyone around the world, not just for people using the latest handsets and desktop computers, because most people aren't. People are using devices from years ago, and low-end, sub-$100 devices where, frankly, today the modern web is just not even reaching them. There's a fantastic talk from Alex Russell who goes into the reality of people with phones that cost less than $100.

Yeah, that's a fantastic one. You'd have the naive thought that, as time goes on, phones are getting steadily better, and that a bottom-of-the-line phone nowadays is just as good as a top-of-the-line phone four years ago, when they're not. The gap is just getting wider and wider; the chasm is opening, rather than anything else. What I heard recently that was really awesome is that Google was incorporating performance metrics into its ratings for search results.

So was this front-end web performance, like render speed, making sure it's not janky, or is this just making sure that a page loads really quickly? It's a tricky one, because we have so many metrics, right? We have time to first byte, we have time to interactive, we have time to first meaningful paint, and then you have the frame rates and so on. Now, Googlebot, which is the tool that basically fetches the data and renders your website for search indexing,

doesn't really interact that much with the page, so we can't really figure out if your scroll is smooth or something like that, but we do get the rendering bits. So we can tell when the page becomes responsive to inputs and when the content is ready for the user to consume, and we're looking at a blend of these kinds of performance measures. Does that make sense? It does make sense.

So do you have any other qualms with how SEO influences the daily work of a web developer? So, a friend of mine recently rebuilt her site using React. She was very excited about it, and it seemed to get quite good client-side performance once it all loaded. Unfortunately, when she sent it out to the team at her company that does SEO analysis, they came back with an answer of: we love your site,

it's really good, but you basically don't appear in the rankings, even though she could show them that, look, right there, it's on Google. Is Google engaging with people who do SEO analysis to ensure that they're running up-to-date metrics, similar to the ones Google uses, so that even with a heavily client-side rendered page they can feel confident that it is being measured well? So, we can't really fix what people are doing in terms of the tools

they're using or anything, but what we do want is to open this black box of SEO for everyone. So we're having this conversation with web developers, we're having this conversation with SEOs and tool makers, and we provide a bunch of metrics and tools as well. We have Search Console, which gives you a bunch of insights into how you're doing in search, so that you're not relying on someone else basically sticking their finger in the wind and reading the stars and stuff.

We also want to make sure that people understand that blanket statements like "JavaScript is going to kill your SEO" or "you cannot use React or Angular" are not necessarily the best way of looking at it. It's a really comfortable answer, probably, but it's not the right answer. All right, so at Chrome Dev Summit I saw your great talk on SEO and the web. Thank you. And one thing you mentioned was that the rendering by Googlebot to actually process a JavaScript-heavy site could take up to a week to happen.

Does this mean that JavaScript-heavy sites are effectively getting penalized in Google Search results? Right, they're not getting penalized, so they are ranking just fine, but the indexing stage is where the problem is. Because, as you say, we process them by putting them first into a rendering queue, and then eventually, as we have the resources available, we render them. If rendering takes a while, that means we cannot refresh the content in the index as quickly, so news sites might want to look into that.

But then again, you have usability issues anyway, right? Yes, right, and that's because that's bad for the user. We try to find search results that are good for the users, and if a page takes ages to load, that is not a good experience for me, so you want to fix that because of the users, not necessarily just because of the crawler. So, if a page is built using... you know, I have a bit of a bias against these JavaScript-heavy, front-end, client-side rendered pages, because they're terrible for everyone who doesn't have an iPhone or the latest Pixel or something, yeah, or a desktop computer.

But anyway, for these sites, if the way they make their money is delivering fresh content daily, does this mean that the content in the search results may actually be out of date for them? They might be lagging, then? Yes, absolutely. And I think, again, it's very important to give the users a great experience, and I don't think you can do that when you are heavily relying on client-side rendering, because users' devices might be really old. So yeah, one way of working around that,

unless you want to properly fix this and do hybrid rendering or server-side rendering, is to do dynamic rendering: basically, give us a statically rendered version of your page for the crawler, so that we can index it quicker. But that's not making the usability and user experience problems go away. So, would you say it's generally safer to rely more on the latest HTML and CSS, knowing that they degrade more gracefully than JavaScript? Yes. If you look at the tristar of technology that we have in the web platform, HTML, CSS, and JavaScript, HTML and CSS are just more resilient than JavaScript, and so relying on JavaScript too heavily is always going to probably get you into trouble in certain ways, with spotty network connections and stuff. So I would say: use polyfills, use progressive enhancement, use what the web platform offers you, and use JavaScript responsibly. Yeah.
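The dynamic rendering workaround mentioned above usually hinges on detecting crawler user agents server-side and handing them a pre-rendered page. A minimal sketch; the bot list is illustrative only (real deployments use maintained lists), and the handler shape is an assumption:

```javascript
// Pure helper: very rough crawler detection by user-agent substring.
function isKnownBot(userAgent = '') {
  return /googlebot|bingbot|baiduspider|yandex/i.test(userAgent);
}

// Hypothetical request handler: crawlers get the statically rendered
// HTML, everyone else gets the normal client-side rendered app.
function chooseResponse(userAgent, { staticHtml, clientApp }) {
  return isKnownBot(userAgent) ? staticHtml : clientApp;
}
```

As the interview stresses, this only helps the crawler; real users on slow devices still get the client-side rendered experience, which is why server-side or hybrid rendering is called the proper fix.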

It's really great to hear, especially from a Googler, that reducing reliance on JavaScript and taking advantage of good HTML and CSS, where available, can actually work wonders for your SEO. Absolutely. Ada, thank you so much for being here and talking to me about performance and SEO. Do you have a feeling that SEO specialists and web developers can work together more nicely, or is there still...? I think, as long as the goals of what people are trying to accomplish are clear, and we're not just resorting to auguries or looking at the stars to work out what Google is thinking, then it's going to enable developers to actually build sites that make sense and take advantage of the platform. Anything Google can do to ensure that the web works for everyone, and not just the wealthy Western web, will be really, really fantastic.

Fantastic closing words. Thank you so much for being here. Thank you. This just in: the next episode of SEO Mythbusting is going to be about SEO in the age of frameworks. Jason Miller and I will talk about what that entails, so stay tuned on this blog, subscribe to Google Webmasters, and see you soon.




SEO Mythbusting 101

I don't know... people say a lot of things out there about how to make your website stand in the top results, but I don't really know how to achieve that, you know? Right, fair enough. That's a really good question, how to achieve that, and I think it's a perfect introduction to what we're trying to do here: we're trying to bust these myths.

What can I help you with? What are the questions that come to your mind? OK, so let's start with something simple: what is a search engine? All right, so a search engine is a platform, or service, or program, whatever you want to call it, that basically goes through the internet's content and tries to catalog it. It works a little bit like a library, right? You go to a library and ask the librarian:

where can I find a book on topic X? That's what you do, and normally it doesn't require you to go through all the books in the library yourself; you just get the right books, and that's what search engines do for you. We find the right content for your purpose. All right, but when I've heard of search engines, I've also heard this word called crawling. Is that a thing? That's a thing. The way that search engines do this is by first going through the entire internet, and we have links from one page to the other, yeah?

So we use that: we start somewhere, with some URLs, and then basically follow links from there on. We are crawling our way through the internet, one page at a time, more or less. Then, once we have these pages, have found them and have grabbed the content from the internet, we need to understand it. We need to figure out: what is this content about, and what purpose does it serve? That's the second stage, which is indexing.
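The start-from-seeds-and-follow-links idea can be illustrated with a toy breadth-first traversal over a pre-fetched link graph. A real crawler fetches pages over the network, respects robots.txt, and schedules recrawls; the in-memory graph here just stands in for that:

```javascript
// Toy sketch of the crawl stage: start from seed URLs and follow
// links breadth-first, recording every page discovered.
function crawl(linkGraph, seeds) {
  const discovered = new Set(seeds);
  const queue = [...seeds];
  while (queue.length > 0) {
    const page = queue.shift();
    for (const link of linkGraph[page] || []) {
      if (!discovered.has(link)) {
        discovered.add(link);
        queue.push(link);
      }
    }
  }
  return [...discovered];
}
```

Everything this traversal discovers then flows into the next stage, indexing, where each page's content and purpose are worked out.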

So then we figure out: this page is about ice cream, this page is about ice cream in Miami, this page is about marmalade, and stuff like that. And then the last step is, when you type something in, you don't type in "I want this particular thing here," you just go like, "I need ice cream," or "ice cream online Miami," right? Yes, you got it. So we then basically look into our index and find the pages that serve this purpose.

Then we try to figure out which one serves these purposes perfectly, or best, and we rank those higher than the others and show you the examples that we found in the index. So how do you know which results are more relevant to a given user? That's a really good question. We have over 200 signals to do so. We look at things like the title, the meta description, the actual content that you've got on your page, images, links, all sorts of things.

Well, right, it's a very complicated question to answer what ranks you best, but yeah, we look at a bunch of signals. Now, if you could give me, like, the top three things that I should consider, what would that be? Right, so, us being developers originally, you probably want me to say: oh, use this framework or use that framework. Yeah, that's not how it works. You have to have really good content, and that means you have to have content that serves a purpose for the user.

It's something that users need and, optimally, also want. Okay, like ice cream. So if your content says where you are, what you do, and how you help me with what I'm trying to accomplish, that's fantastic. If you just have a page that says "we are a fantastic company and we have plenty of products", that's not serving a purpose. So you want to make sure to serve the purpose of the people you want to attract and have interact with your content, and you want to make sure that you're using the words that I would be using. Say you use a very specific term for your ice cream, let's say "Smooth Cream 5000" or something like that.

I'm not going to search for that, because I don't know about it. I'm just going to go like: I need ice cream. It's good to mention it somewhere, so that if I look for that trademark, I find it as well. Okay, but if I'm exploring ice cream around me, I don't know what particular ice cream there is. If there's a specific brand, fantastic, but that's not what I'm looking for. So speak the language that I'm using.

So you're saying it's more like an elevator pitch? Exactly. When we meet and you have a fantastic product, or I have a fantastic product, I wouldn't go like: yeah, Blurpmaster 5000, it's fantastic. And you're like: yeah, but what does it do? Right, so do that, do an elevator pitch, and help us put you in contact with the right people. Okay, so content is the number one priority. Could you mention another two things that are important for this? Yeah, you're going to love them, because they are technical. The second biggest thing is to make sure that you have meta tags that describe your content.

So, have a meta description, okay, because that gives you the possibility to have a little snippet in the search results that lets people find out which of the many results might be the one that helps them best. And have page titles that are specific to the page that you are serving. So don't have one title for everything; the same title everywhere is bad. If you have titles that change with the content you're showing, that is fantastic, and frameworks have ways of doing that, so consult the documentation. That's definitely something that helps with the content. And the last bit is performance.

Performance, right? Yeah, performance is fantastic. We're talking about it constantly, but we're probably missing out on the fact that it is also good for being discovered online. Ah, so performance is not just making my website faster, but also making my website more visible to others? Correct, okay. Because we want to make sure that the people clicking on your search result, clicking on your page, are getting the content quickly.

So that's one thing that we want to make sure of as well. It's one of the many signals that we are looking at, but it also just helps your users, right? They get happier: if I want ice cream really badly, then I get the page quicker, and that's fantastic, yeah. So if you want to look at performance, I highly recommend looking into hybrid rendering or server-side rendering, again because that gets the content to the users quicker.

Usually, right. Also, you might have bots that don't run JavaScript. Googlebot does, but not everyone else does, necessarily. So you probably want to figure out something like dynamic rendering if you don't want to make code changes, because I understand we're all pressed for time; we have lots of bugs and features to work through. So if you can't change the code, dynamic rendering might be something that gets you there.

Okay, if there are rendering issues with your content. But besides that, I would say definitely look into performance optimization: get the content in quicker, get the First Contentful Paint in there quicker, optimize your servers, optimize your caching strategies, and make sure that your script doesn't have to run for, like, 60 seconds to fetch everything that you need. I know, yeah. So those are things that you should definitely look into, and I guess performance is something that pretty much everyone in the developer community is looking at.
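Dynamic rendering, as mentioned above, usually means detecting crawlers on the server and serving them prerendered HTML while regular browsers get the client-side app. A minimal sketch of that decision, assuming a small hand-maintained list of user-agent substrings (real setups typically rely on a maintained bot list or a prerendering service):

```javascript
// Hypothetical list of crawler user-agent substrings; real deployments
// would use a maintained list or a prerendering service instead.
const BOT_PATTERNS = ['googlebot', 'bingbot', 'duckduckbot', 'baiduspider'];

// Decide whether a request comes from a known crawler.
function isCrawler(userAgent) {
  const ua = (userAgent || '').toLowerCase();
  return BOT_PATTERNS.some((pattern) => ua.includes(pattern));
}

// Hypothetical server handler: crawlers receive prerendered static HTML,
// regular browsers receive the normal client-side-rendered app shell.
function chooseResponse(userAgent) {
  return isCrawler(userAgent) ? 'prerendered-html' : 'spa-shell';
}
```

In a real server this decision would sit in middleware that routes the request either to a prerendered snapshot or to the single-page app.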

Certainly, yes. Or they should, at least; I hope that they do. Okay, so we already discussed all these basics around SEO and search engines and how to position my website in the top search results. Now the question is: why is it so important for companies to rank in the top results? Right, so you're a web developer, right? Yes. You build stuff on the internet? Yeah.

Do you want people to use it? Certainly, yes. Certainly, right. So in order to make sure that people can use it, they have to know about it, and unless you are one of the really big players, they might not. And even for the big players, if they launch something new, you might not know about it. And you're not looking specifically for products; you're looking for something that serves a purpose for you. Okay: I want to know how to build this thing with a framework, I want to know where to find the best ice cream in the place I'm in.

I want to find the cutest dogs and puppers online. So, like, I have a purpose; I don't know who serves this purpose, necessarily. So say you build the best ice cream PWA ever in, let's say, Medellín. Is that how you pronounce it? So if you build the best PWA to order ice cream online in Medellín, then I don't know about that, especially if I come as a tourist. But if I type that into a search engine, like "order ice cream in Medellín", and it goes like: hey, this PWA does the trick. Yeah, you want to be the first, or among the first couple of results, because I'm not going to go to page 99 and go like: oh yeah.

This might be the perfect thing. Because Google and other search engines are trying to figure out what is best for this purpose and then show me those up front, and then I might pick from those, because normally they're pretty good. I think that covers all the questions I have. Fantastic, so you feel ready to build that? Certainly. Excellent, that is so cool. Thank you so much for being here.

Thank you. That was my guest, and I hope that this helps other developers as well; developers and SEOs can be friends, I think. I think so, yeah, I think so. Thank you. Oh, are we still on? Please stay tuned for another episode of SEO Mythbusting. Next time, with Suz Hinton, we'll talk about what Googlebot is, so come back again and see what happens!




Essential JavaScript SEO tips – JavaScript SEO

Let's look at a few SEO techniques to help users find your content. All of your pages should have a descriptive and helpful title that describes what the page is about in very short terms. For example, on recipe pages, avoid using a generic title such as "Barbara's baking blog".

Instead, each page should have the name of the recipe in the title, so it's clear what the page is about. You should also provide a description of what the page will contain, specifically, for example, what makes this recipe special and what its main characteristics are, so people have something to help them identify the best page to fulfill their goal. Both of these can be done by adding title and meta tags to your markup.
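As a rough sketch of what per-page title and meta tags can look like when pages are generated from data, here is a hypothetical helper in JavaScript; the page data and function name are invented for illustration, and a framework would normally provide this for you:

```javascript
// Hypothetical page data; in practice this would come from your CMS,
// database, or router configuration.
const recipePage = {
  title: "Classic Apple Pie – Barbara's Baking Blog",
  description: 'A flaky-crust apple pie recipe with step-by-step photos.',
};

// Build a page-specific <title> and meta description for the document head.
function headTagsFor(page) {
  return [
    `<title>${page.title}</title>`,
    `<meta name="description" content="${page.description}">`,
  ].join('\n');
}
```

The important part is simply that each page feeds its own data into these tags, rather than every page sharing one generic title and description.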

You can check your pages for those tags by right-clicking, choosing Inspect, and then searching for //title and //meta to find them. If you do not see all of your content in the markup, you are probably using JavaScript to render your page in the browser. This is called client-side rendering and is not a problem per se. Rendering is the process of populating templates with data from APIs or databases.

This can happen either on the server side or on the client side. When it happens on the server, crawlers as well as your users get all the content as HTML markup immediately. In single-page apps, the server often sends the templates and JavaScript to the client, and the JavaScript then fetches the data from the backend, populating the templates as the data arrives. As explained in the first episode, the indexing of JavaScript sites happens in two waves.

Content that requires JavaScript to be rendered will only be indexed in the second wave, which might take some time. In later episodes, we will cover how to overcome this, and often also improve the user experience and loading performance, by using techniques such as dynamic rendering, hybrid rendering, or server-side rendering for single-page apps. Another important detail is to allow Googlebot to crawl pages from your website by linking between your pages properly. Make sure to include useful link anchor text and use the HTML anchor tag with the destination

URL of the link in the href attribute. Do not rely on other HTML elements such as div or span, or use JavaScript event handlers for this. Not only will crawlers have trouble finding and following these pseudo-links; they also cause issues with assistive technology. Links are an essential feature of the web and help search engines and users find and understand the relationships between pages. If you are using JavaScript to enhance the transitions between individual pages, use

the History API with normal URLs instead of the hash-based routing technique. Using hashes, also called fragment identifiers, to distinguish between different pages is a hack that crawlers tend to ignore. Using the JavaScript History API with normal URLs, on the other hand, provides a clean solution for the same purpose. Remember to test your pages and server configuration when using JavaScript to do the routing on the client side.
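A minimal sketch of History API routing with normal URLs might look like the following. The route table and rendering are hypothetical, and the browser-specific part is guarded so the pure routing logic can also run outside a browser:

```javascript
// Hypothetical route table: one normal URL per page, no hash fragments.
const routes = {
  '/': () => 'Home',
  '/recipes': () => 'Recipes',
};

// Resolve a path to its page content (one crawlable URL per page).
function resolve(path) {
  const render = routes[path];
  return render ? render() : 'Not found';
}

// Browser-only wiring, guarded so this file also loads in Node.
if (typeof window !== 'undefined') {
  // Intercept in-page link clicks and update the URL without a hash.
  document.addEventListener('click', (event) => {
    const link = event.target.closest('a[href^="/"]');
    if (!link) return;
    event.preventDefault();
    history.pushState({}, '', link.getAttribute('href'));
    document.body.textContent = resolve(location.pathname);
  });

  // Re-render when the user navigates with the back/forward buttons.
  window.addEventListener('popstate', () => {
    document.body.textContent = resolve(location.pathname);
  });
}
```

Because each route is a normal URL, the server must also be configured to answer requests for /recipes directly, which is exactly what the testing advice above is about.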

Googlebot will be visiting your pages individually, so neither a service worker nor JavaScript using the History API can be relied on to navigate between pages. Test what a user would see by opening your URLs in a new incognito window; the page should load with an HTTP 200 status code, and all the expected content should be visible. Using semantic HTML markup properly helps users better understand your content, as well as navigate it quicker. Assistive technologies like screen

readers and crawlers also rely on the semantics of your content. Use headings, sections, and paragraphs to outline the structure of your content, and use HTML image and video tags with captions and alt text to add visuals. This helps crawlers and assistive technology find this content and surface it to your users. In contrast, if you use JavaScript to generate your markup dynamically, make sure you aren't accidentally blocking Googlebot in your initial markup. As explained in the previous episode,

the first round of indexing does not execute JavaScript, so markup such as a noindex meta tag in the initial payload can prevent Googlebot from running the second stage with JavaScript. Following these steps will help Googlebot understand your content better and make your content more discoverable in Google Search. Hi Googlebot! Hi Googlebot, did you see the new Webmasters article series? No, I did not. What? You missed out on so much stuff!

Really? Yes. Oh no, how could we have prevented that? Well, subscribe and follow our articles!
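As a quick, assumption-laden check of the noindex caveat mentioned above, one could scan the initial HTML payload, before any JavaScript runs, for a robots noindex meta tag. This uses a simple regular expression rather than a real HTML parser, so treat it as a sketch only:

```javascript
// Rough check for a robots noindex meta tag in the initial, pre-JavaScript
// HTML payload. A regex is used for brevity; a real audit tool would parse
// the HTML properly.
function hasNoindexInInitialPayload(html) {
  const metaTags = html.match(/<meta[^>]*>/gi) || [];
  return metaTags.some(
    (tag) => /name=["']?robots["']?/i.test(tag) && /noindex/i.test(tag)
  );
}
```

If this returns true for the markup your server sends, Googlebot's first indexing wave will see the noindex directive and may never run the second, JavaScript-enabled wave.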

