
JavaScript: SEO Mythbusting

Where do these myths come from? How do these legends that follow JavaScript around get into the world? I think a lot of it is people with very good intentions trying to provide the information they have available, and there’s a gap in translation between the SEOs and the developers, in how they think and what they consider. So by adopting acceptance criteria as part of my tickets when I work with devs, that lets them know very specifically what I need, instead of me saying, “I want you to make magic for me.”

You go from “give me magic” to “hey, here’s my user story; I would like to accomplish these three pieces of acceptance criteria.” You can bridge the gap. Hello and welcome to another episode of SEO Mythbusting. With me today is Jamie Alberico. Jamie, what do you do in your job? Thank you so much for having me here. I’m a technical SEO with Arrow Electronics. That means that I’m embedded with a number of dev teams across a number of projects, and we try to execute these initiatives and get new features available on the site in an effective and search-friendly way.

And that means a lot of the time we have to have conversations about how we’re using our JavaScript. Having you here is fantastic, because then we can have a conversation about pretty much everything you want to know from the search side as well as the web developer side. So, any questions that you have in mind, or anything that pops into your mind? Oh, so many questions. I hope I get to poke at the black box of Google here. And I have one that’s absolutely burning.

Is JavaScript the devil? That’s a fantastic question. It might seem that way sometimes, especially when things are not going the way you want. You see the horror stories on forums or on Twitter: everything is gone. Yeah, that’s one side of it, the SEO side. On the developer side it’s also like, oh, it’s a language that wasn’t designed to be super resilient, but it actually is. And then often people say, oh, it’s a C-style language, and it’s not really.

It’s more of a Lisp-style language. There are a lot of misconceptions coming from both worlds and clashing here. I don’t think it is the devil. I think it has its benefits. I mean, it allows us to build really cool and fantastic stuff on the web and be really responsive to what the user does and wants to do with our applications. And it has moved the web from being a document platform towards an application platform.

And I think that’s fantastic. So I think we are already pushing hard against this “JavaScript is the devil” and “if you use JavaScript, you can’t be indexed at all.” That hasn’t been true for a long time, and I think the documentation is now catching up, outlining the different bits and pieces you should be aware of and the features you have to deal with that are not available.

One thing, for instance: you have probably built single page applications, right? Oh, yes. Have there been problems in terms of SEO when they rolled out? I was pretty lucky; I had a dev team who believed in SEO. That’s good, that’s really good. That was actually the big moment of my career, when I got into technical SEO: I came and talked to one of my new developers for the first time about a very specific problem I was trying to solve, and he just paused, looked up from his keyboard, and went, “You’re not snake oil.” So I think we’re making a lot of progress between SEOs and devs.

That is fantastic; it’s a great story. So you might hear a few people in the community going, “Ooh, should we do a single page application? Is that risky?” And one of the things that a bunch of developers are not aware of, and some SEOs are not necessarily communicating all the time, is that we are stateless. With a single page application you have a bit of application state, right? You know which page you are looking at and how you transition between these pages.

However, when a search user clicks on a search result, they don’t have this application state. They are jumping right into the page that we indexed, so we only index pages that can be jumped right into. But a lot of JavaScript technology makes assumptions about how the user navigates through the application. As a developer, my test is: okay, here’s my application.

I click on the main navigation for this particular page, then I click on this product, and I see that everything works. But that might not do the trick, because you need that unique URL. It has to be something we can get right to, not a hashed URL, and the server needs to be able to serve it right away. If I do this journey, then take this URL and copy and paste it into an incognito browser, I want people to see the content, not the home page and not a 404 page.
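To make that concrete, here is a minimal sketch of History API routing in a single page application. The routes and renderer functions are hypothetical, and a real app would likely use a router library; the key points are real paths instead of # fragments, and a server that answers those same paths directly.

```js
// Hypothetical view renderers; a real app would build proper DOM here.
function renderHome()    { document.body.textContent = "Home"; }
function renderProduct() { document.body.textContent = "Blue Widget"; }
function render404()     { document.body.textContent = "Not found"; }

const routes = {
  "/": renderHome,
  "/products/blue-widget": renderProduct, // a real path, not a #fragment
};

function renderRoute() {
  (routes[location.pathname] || render404)();
}

// Call this from click handlers instead of letting links do full page loads.
function navigate(path) {
  history.pushState({}, "", path);
  renderRoute();
}

// Handle the back/forward buttons.
window.addEventListener("popstate", renderRoute);

// First load: a crawler, or a visitor pasting the URL into an incognito
// window, lands here directly; the server must serve the app on this URL too.
renderRoute();
```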

So that’s something we’re working on giving more guidance for: lazy loading. You have probably seen a bunch of communication about that one? Yes. How do we get a rich media experience out to users, but do it in a way where, if you’re on your cell phone, we respect that very small time frame we have to get your attention? Correct. And you want to make sure that if you have a long list of content, you don’t bring everything in at once, especially on a cell phone; nobody wants to load, like, 100 images up front.
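As a rough illustration of that kind of lazy loading, here is a small sketch using IntersectionObserver; the data-src markup convention is an assumption for this example, and modern browsers also support the simpler loading="lazy" attribute on images.

```js
// Lazy-load images marked up as <img data-src="photo.jpg" alt="...">.
const io = new IntersectionObserver((entries, observer) => {
  for (const entry of entries) {
    if (entry.isIntersecting) {
      const img = entry.target;
      img.src = img.dataset.src; // swap in the real image URL
      observer.unobserve(img);   // each image only needs to load once
    }
  }
}, { rootMargin: "200px" });     // start fetching shortly before it scrolls into view

document.querySelectorAll("img[data-src]").forEach((img) => io.observe(img));
```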

What about Ajax? What about using asynchronous JavaScript and XML? Whoa, I haven’t heard Ajax being called that in a while; the term fell out of fashion. I mean, everyone’s using it, but no one’s talking about it that much. You just load data in as you go, and that’s perfectly fine; we are able to deal with that. I also often get asked about how it affects the crawl budget. Let’s talk! So what worries you about that? Well, say I request a product detail page and we’re using Ajax to pull in a lot of additional pieces of content.

Right. Googlebot has requested one URL and gotten back nine, because each of those Ajax calls had a unique string, right? How do we handle that, and does it negatively impact our crawl budget? I wouldn’t say it negatively impacts your crawl budget, because crawl budget is much more complex than you might think. It’s one of these things that looks super simple, but there’s more than meets the eye. We’re doing a bunch of caching, because we expect that content doesn’t necessarily update too much.

So let’s say you have this product page. You make one request to the product page, and then that makes nine more requests. We don’t distinguish between loading the CSS or the JavaScript or the images or the API calls that get you the product details. So if you have nine calls from this one page load, that’s going to be ten in the crawl budget. But because of caching, we might have some of these in the cache already, and anything that is already cached doesn’t count towards your crawl budget.

So if we were to version our Ajax calls, those could be cached? Those could be cached, exactly. That’s one way of working around it, if that’s a possibility. The other thing is, you could consider it not just an issue for the crawl budget but also an issue for the user, because if you’re on a slow or spotty network connection, it might flake out in the middle and you’re left with broken content. That’s not a great user experience.
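A minimal sketch of that versioning idea, assuming a Node/Express backend and a hypothetical product endpoint: because the version is part of the URL, each response can be cached aggressively, and shipping a new version path (say /api/v4/) invalidates every old cache at once.

```js
const express = require("express");
const app = express();

// Hypothetical product lookup.
function loadProduct(id) {
  return { id, name: "Blue Widget" };
}

// Versioned Ajax endpoint, e.g. fetched as /api/v3/products/123.
app.get("/api/v3/products/:id", (req, res) => {
  // Long-lived, immutable caching is safe because the URL changes per version.
  res.set("Cache-Control", "public, max-age=31536000, immutable");
  res.json(loadProduct(req.params.id));
});

app.listen(3000);
```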

You probably want to think about pre-rendering, hybrid rendering, server-side rendering, or anything in between. And crawl budget is tricky generally, because we are trying to deal with the whole “host load” situation: what can your server actually handle? We are constantly adjusting that anyway. So people say, “Oh, this affected our crawl budget negatively.” Not really; we just had host load issues with your server, so we adjusted anyway and rebalanced across your entire content.

So I would say it’s not that big of a deal, but I see that it’s very important for people to understand, and unfortunately that’s not easy. Can we demystify Googlebot a little bit? Because we have this omnibus, the great Googlebot, but it actually goes through a series of actions. We get that initial HTML parse, we find the JavaScript and CSS that we need to go ahead and render our content, and then we call those pieces.

We know, since Google I/O, that there is actually a gap between that initial parse and the rendering. But I want to know more, because Googlebot follows HTML5 parsing rules, yes, and there are some nuances there I didn’t know about. Say you’ve got an iframe in your head; that closes your head right there as far as Googlebot is concerned. Yeah. And all of our lovely meta content, our hreflang and canonical tags, have a tendency to sit below that.
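A small sketch of the parsing pitfall being described: an iframe is not valid inside the head, so an HTML5 parser implicitly closes the head when it encounters one, and everything after it is treated as body content, where metadata like canonical and hreflang tags may be ignored.

```html
<!-- Problematic: the parser closes <head> at the <iframe>. -->
<head>
  <title>Product page</title>
  <iframe src="https://example.com/widget"></iframe>
  <!-- Now effectively in <body>, so these may be ignored: -->
  <link rel="canonical" href="https://example.com/products/blue-widget">
  <link rel="alternate" hreflang="de" href="https://example.com/de/products/blue-widget">
</head>

<!-- Safer: metadata first in <head>, embeds in <body>. -->
<head>
  <title>Product page</title>
  <link rel="canonical" href="https://example.com/products/blue-widget">
  <link rel="alternate" hreflang="de" href="https://example.com/de/products/blue-widget">
</head>
<body>
  <iframe src="https://example.com/widget"></iframe>
</body>
```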

That is true; there are a bunch of things at play here. When we say Googlebot, what we actually mean on the other side of the curtain is a lot of moving parts. There’s the crawling bit that literally takes in URLs and fetches them from the server, so when you provide the content to us, we get the raw HTML. That tells us about the CSS, the JavaScript, and the images that we need to fetch, and also the links in the initial HTML. And because we have that, we already have a wealth of information.

We can then go off and fetch the JavaScript and everything else we need to render later on, but we can also already use the HTML we’ve got and say, “Oh look, there are links in here that need to be crawled.” So when you have links in your initial HTML, we can go off and basically start the same process for those URLs as well. A lot of things happen in parallel, rather than one step, then the next step, then the next step.

So this is definitely the start of it. As we get the HTML, in parallel to extracting the links and crawling those, we queue the page for rendering, and we can’t index it before we have rendered it, because a bunch of content still needs to be rendered first. In a way, that better fits us if we’ve got a single page application: Googlebot has the template; it just has to grab the content that fits within it. Yeah.
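As a toy model only, and emphatically not Google’s actual implementation, the split described above might look like this: links found in the raw HTML feed the crawl queue immediately, while the page itself waits in a separate queue until it can be rendered and then indexed.

```js
// Toy model of crawl vs. render queues (illustration only).
const crawlQueue = ["https://example.com/"];
const renderQueue = [];

function extractLinks(html) {
  // Naive link extraction, for illustration only.
  return [...html.matchAll(/href="(https?:[^"#]+)"/g)].map((m) => m[1]);
}

async function crawl(url) {
  const html = await fetch(url).then((r) => r.text());
  crawlQueue.push(...extractLinks(html)); // crawl these in parallel, right away
  renderQueue.push({ url, html });        // render (and only then index) later
}

crawl(crawlQueue.shift()).then(() => {
  console.log(`to crawl: ${crawlQueue.length}, awaiting render: ${renderQueue.length}`);
});
```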

So wouldn’t that mean that Googlebot likes these JavaScript platforms? The more content you get us quickly in the first step, the crawling step, the better, because we can then carry that information over rather than having to wait for the rendering to happen. But is pre-rendering always the best solution? That’s a tricky one. I think most of the time it is, because it has benefits for the user on top of just the crawlers, but you have to measure very carefully what you’re doing there. Giving us more content up front is generally a great thing.

That doesn’t mean you should always give us a page with a bazillion images right away, because that’s just not going to be good for the users. If you’re on a really old phone, and I have a pretty old phone, and you have a page full of images and transitions and stuff, then you’re like, “I can’t use this website.” So pre-rendering everything is not always a great idea.

It should always be a mix of getting as much crucial content in as possible and then figuring out which content you can load lazily afterwards. So for SEOs: we know that different queries have different intents, informational, transactional, and so on, so elements critical to that intent should really be in that initial rush? Exactly. And if the intents are wildly different and the content is very, very different, consider making it into multiple pages, or at least multiple views if you’re using a single page application, so that you have an entry point the crawler can specifically point at when it comes to surfacing the search results. So treat it like a hub and let the users branch out from there.

Yes. So is that where we’d use, maybe, our CSS toggle for visibility? That is a possibility, but just having different URLs is always an option, especially with the History API. In a single page application you can figure out which route to display and then separate the content between different routes, or be a little more dynamic there. We also support parameters, so even if you use URL parameters, that works.

Basically, expose the state that is relevant to the user in the URL. What other ways does that benefit our users? Because our ultimate goal is to make them happy. And that’s our ultimate goal too, so we are the same in terms of what our goal is: we both want to surface useful information to the user as quickly as possible. The user’s benefit, especially if you do hybrid rendering or server-side rendering, is that they get the content really quickly.
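Picking up the earlier point about exposing user-relevant state in the URL, here is a small sketch using URL parameters and the History API; the color filter and its renderer are hypothetical.

```js
// Hypothetical renderer for a filtered product list.
function renderProductList(color) {
  document.body.textContent = `Showing ${color} products`;
}

// Reflect the active filter in the URL, e.g. /products?color=blue,
// so the state is linkable, shareable, and indexable on its own.
function applyColorFilter(color) {
  const url = new URL(location.href);
  url.searchParams.set("color", color);
  history.pushState({ color }, "", url);
  renderProductList(color);
}

// On load, restore whatever state the URL encodes.
renderProductList(new URL(location.href).searchParams.get("color") || "all");
```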

Normally, if it’s done well, it’s not overloading their device, and they get to jump in right where the meaty bits are. So if I’m looking for some specific thing and you give me a URL that takes me right to that specific thing, I’m right there and I’ll have a great time, because it’s the content that I needed. And if your performance metrics go up as well, then even if I’m on a slow phone and a really spotty network, I still get there.

I mean, our performance metrics are based on a lot of pieces; we have a whole stack of technology. That is true. What should SEOs look for in our stack? Where should we try to identify those areas where we could create a better experience, not just for Googlebot but for our humans? So I think a bit that is oftentimes overlooked, not by SEOs but by businesses and developers, is the content part. You want to make sure that the content is what the users need and want, and that it’s written in a way that helps them.

But on the technology side... Wait, so that blurb at the top that people always do, where it’s like, here’s my hero image and then 500 words about this thing, and I’m a human who wants to buy something and there’s so much stuff in the way? Yeah, don’t do it. At least have two pages: the promotional page that you want to do direct marketing towards, and then, if I specifically look for your product, just give me your product.

Just let me give you money! So, talking about performance and all the different metrics, it’s a bit of a blend of all the things. Look at: when does my content actually arrive, and when does my page become responsive? You look at First Contentful Paint, and you look at Time To First Byte as well, though that’s less important than First Contentful Paint, I would say, because it’s fine if it takes a little longer as long as the content is then all there.

So Time To First Byte can take a bit of a hit if we deliver a faster first meaningful paint? Exactly, because in the end, as a user, I don’t care that the first byte arrived quicker if I’m still looking at a blank page because JavaScript is executing or something is blocking a resource. If it arrives a little later but then the content is right there, that’s fantastic, and you can get there in multiple ways.
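For readers who want to watch these metrics on their own pages, here is a small sketch using the browser’s performance APIs to log First Contentful Paint and Time To First Byte.

```js
// Log First Contentful Paint when the browser reports it.
const po = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    if (entry.name === "first-contentful-paint") {
      console.log(`FCP: ${entry.startTime.toFixed(0)} ms`);
    }
  }
});
po.observe({ type: "paint", buffered: true });

// Time To First Byte from the Navigation Timing API.
const [nav] = performance.getEntriesByType("navigation");
if (nav) {
  console.log(`TTFB: ${(nav.responseStart - nav.requestStart).toFixed(0)} ms`);
}
```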

I highly recommend testing, testing, testing. What testing tools would you recommend? I definitely recommend Lighthouse; that’s a great one. Webhint is a broader approach as well, and you could also use PageSpeed Insights or the new SEO audits in Lighthouse. The Mobile-Friendly Test also gives you a bunch of information. PageSpeed Insights looks at the full page load, though, and there we had a bit of a gap.

We have this almost futurist Lighthouse, where we want that time to interactive, and then we have people still adopting the old methodology. That’s how we ended up with, you know, so much content loaded via Ajax: the full page load looks fast, but all that content is still coming in. I would recommend Lighthouse; it gives you the filmstrip view of when things are actually ready for the user to work with. So I would highly recommend looking at Lighthouse, but PageSpeed Insights gives you a good first overview, and it integrates with Lighthouse really nicely now.
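If you want to try Lighthouse outside of Chrome DevTools, it also ships as a command-line tool (Node.js required); example.com below is just a placeholder.

```sh
# Install the Lighthouse CLI and audit a page, writing an HTML report.
npm install -g lighthouse
lighthouse https://example.com --output html --output-path ./report.html --view
```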

Wonderful. Do you think that JavaScript and SEO can be friends now, and that developers and SEOs can also work together? I do. I really think that, you know, if Google is a library and a webpage is a book, using these JavaScript frameworks lets us make pop-up books: richer experiences to engage with. Oh, that’s a fantastic analogy; I love that image. That’s a beautiful one. Thank you so much, Jamie.

Thank you very much, and I hope you enjoyed it. See you next time! Have you ever wondered where on the map you should put UX and performance when you’re talking about SEO? So have I. Let’s find out in the next SEO Mythbusting episode.



Web Performance: SEO Mythbusting

Mythbusting with me today is Ada Rose Cannon, and you’re working for Samsung, is that right? What do you do at Samsung? So, for Samsung I’m a developer advocate for the web browser Samsung Internet. Samsung Internet is a web browser for Android phones.

You can get it from the Play Store, but not a lot of people have heard about it, so a lot of what I do is trying to raise awareness. But more important than that, what I’m trying to do is advocate for the web as a platform, to try and encourage developers to build for it and to make sure it lasts long into the future as a great and healthy platform for people to build stuff with. I love having you here, because I want to talk to you about SEO versus performance and usability on the web, and I think we need to get some stuff out of the way first, right?

So, would you say, what are the most important bits and pieces that you would like people to focus on more when building web stuff? I have a huge passion for ensuring that the web remains great for everyone around the world, not just for people using the latest handsets and desktop computers, because most people aren’t; people are using devices from years ago and low-end, sub-$100 devices where, frankly, today the modern web is just not even reaching them. There’s a fantastic talk from Alex Russell who goes into the reality of people with phones that cost less than $100.

Yeah, that’s a fantastic one. You’d have the naive thought that, as time goes on, phones are getting steadily better, and that a bottom-of-the-line phone nowadays is just as good as a top-of-the-line phone four years ago, when they’re not. The gap is just getting wider and wider; the chasm is opening rather than anything else. What was really awesome: I heard recently that Google was factoring performance metrics into its ratings for search results.

I don’t know the details, though. So was this front-end web performance, like render speed, making sure it’s not janky, or is this just making sure that a page loads really quickly? It’s a tricky one, because we have so many metrics, right? We have Time To First Byte, we have time to interactive, we have time to first meaningful paint, and then you have the frame rates and stuff. Now, Googlebot is the tool that basically fetches the data and renders your website for Search indexing.

We don’t really interact that much with the page, so we can’t really figure out whether your scroll is smooth or something like that, but we do get the rendering bits. So we can tell when the page becomes responsive to inputs and when the content is ready for the user to consume. We’re looking at a blend of these kinds of performance signals. Does that make sense? It does make sense.

So do you have any other qualms with how SEO influences the daily work of a web developer? A friend of mine recently rebuilt her site using React. She was very excited about it and seemed to get quite good client-side performance once it all loaded. Unfortunately, when she sent it out to her company’s team which does SEO analysis, they came back with an answer of: “We love your site, it’s really good.”

“But you basically don’t appear in the rankings,” even though she could show them that, look, right there, it’s on Google. Is Google engaging with people who do SEO analysis to ensure that they’re running up-to-date metrics, similar to Google’s, so that even with a heavily client-side-rendered page they can feel confident it is being measured well? So, we can’t really fix what people are doing in terms of the tools they’re using.

But what we do want is to open this black box of SEO for everyone. So we’re having this conversation with web developers, we’re having this conversation with SEOs and tool makers, and we provide a bunch of metrics and tools as well. We have Search Console, which gives you a bunch of insights into how you’re doing in search, so that you’re not relying on someone else basically sticking a finger in the wind and reading the stars.

We also want to make sure that people understand that blanket statements like “JavaScript is going to kill your SEO” or “you cannot use React or Angular” are not necessarily the best way of looking at it. It’s a really comfortable answer, probably, but sometimes it’s not the right answer. All right, so at Chrome Dev Summit I saw your great talk on SEO and the web. Thank you. And one thing you mentioned was that the rendering Googlebot does to actually process a JavaScript-heavy site could take up to a week to happen.

Does this mean that JavaScript-heavy sites are effectively getting penalized in Google Search results? They’re not getting penalized; they rank just fine. But the indexing stage is where the problem is, because, as you say, we process them by first putting them into a rendering queue and then, eventually, as we have the resources available, we render them. And if the rendering takes a while, that means we cannot refresh the content in the index as quickly, so news sites might want to look into that.

But then again, you have usability issues anyway, right? Yes, and that’s because that’s bad for the user. We try to find search results that are good for the users, and if a page takes ages to load, that is not a good experience for me. So you want to fix that because of the users, not necessarily just because of the crawler. So if a page is built using, and I know I have a bit of a bias against these JavaScript-heavy, front-end client-side-rendered pages, because they’re terrible for everyone who doesn’t have an iPhone or the latest Pixel or something. Yeah, or a desktop computer.

But anyway, for these sites, if the way they make their money is delivering fresh content daily, does this mean that the content in the search results may actually be out of date? They might be lagging then? Yes, absolutely. And I think, again, it’s very important to give the users a great experience, and I don’t think you can do that when you are heavily relying on client-side rendering, because devices might be really old. So there is one way of working around that.

Unless you want to properly fix this with hybrid rendering or server-side rendering, one way around it is to do dynamic rendering: basically, give us a statically rendered version of your page for the crawler so that we can index it quicker. But that doesn’t make the usability and user experience problems go away. So would you say it’s generally safer to rely more on HTML and CSS, knowing that they degrade more gracefully than JavaScript? Yes. If you look at the three core technologies that we have in the web platform, HTML, CSS, and JavaScript, HTML and CSS are just more resilient than JavaScript, and so relying on JavaScript too heavily is probably going to get you into trouble in certain situations, with spotty network connections and stuff. So I would say: use polyfills, use progressive enhancement, use what the web platform offers you, and use JavaScript responsibly.
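As one small example of the progressive enhancement being recommended here, this sketch has a plain HTML form that works with no JavaScript at all, while JavaScript, when it is available, upgrades it in place; the /search endpoint and its partial parameter are hypothetical.

```html
<!-- Baseline: a normal form that navigates to a server-rendered results page. -->
<form action="/search" method="get">
  <input type="search" name="q">
  <button type="submit">Search</button>
</form>

<script>
  // Enhancement: if JS runs, fetch results in place instead of navigating.
  const form = document.querySelector("form");
  form.addEventListener("submit", async (event) => {
    event.preventDefault();
    const q = new FormData(form).get("q");
    const res = await fetch(`/search?q=${encodeURIComponent(q)}&partial=1`);
    document.body.insertAdjacentHTML("beforeend", await res.text());
  });
</script>
```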

It’s really great to hear, especially from a Googler, that reducing reliance on JavaScript and taking advantage of good HTML and CSS where available can actually work wonders for your SEO. Absolutely. Ada, thank you so much for being here and talking to me about performance and SEO. Do you have a feeling that SEOs and web developers can work together more nicely now, or is there still...? I think, as long as the goals of what people are trying to accomplish are clear, and we’re not just resorting to auguries or looking at the stars to work out what Google is thinking, that’s going to enable developers to actually build sites that make sense and take advantage of the platform. Anything Google can do to ensure that the web works for everyone, and not just the wealthy Western web, will be really, really fantastic.

Fantastic closing words. Thank you so much for being here. Thank you. This just in: the next episode of SEO Mythbusting is going to be about SEO in the age of frameworks; Jason Miller and I will talk about what that entails. So stay tuned on this blog, subscribe to Google Webmasters, and see you soon.

