Categories
Online Marketing

Essential JavaScript SEO tips – JavaScript SEO

Let’s look at a few SEO techniques to help users find your content. All of your pages should have a descriptive, helpful title that says what the page is about in very few words. On recipe pages, for example, avoid a generic title such as “Barbara’s Baking Blog.”

Instead, each page should have the name of the recipe in the title, so it’s clear what the page is about. You should also provide a description of what the page contains, for example what makes this recipe special or what its main characteristics are, so people have something to help them identify the best page for their goal. Both of these can be done by adding title and meta tags to your markup.
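As a sketch, the two tags might be generated per recipe like this (the recipe data and the helper function are hypothetical, shown as plain string templating):

```javascript
// Hypothetical sketch: build a descriptive <title> and meta description
// for each recipe page instead of one generic title for the whole blog.
function buildHeadTags(recipe) {
  return [
    `<title>${recipe.name} | Barbara's Baking Blog</title>`,
    `<meta name="description" content="${recipe.summary}">`,
  ].join('\n');
}

const headTags = buildHeadTags({
  name: 'Sourdough Cinnamon Rolls',
  summary: 'Overnight sourdough cinnamon rolls with a cream cheese glaze.',
});
```

Server-side, these strings would be emitted into the initial HTML; in a purely client-rendered app the equivalent would be setting `document.title` and the meta element from JavaScript.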

You can check your pages for these tags by right-clicking, choosing Inspect, and then searching for the XPath expressions //title and //meta to find them. If you do not see all of your content in the markup, you are probably using JavaScript to render your page in the browser. This is called client-side rendering and is not a problem per se. Rendering is the process of populating templates with data from APIs or databases.

This can happen either on the server side or on the client side. When it happens on the server, crawlers as well as your users get all the content as HTML markup immediately. In single-page apps, the server often sends the templates and JavaScript to the client, and the JavaScript then fetches the data from the backend, populating the templates as the data arrives. As explained in the first episode, indexing for JavaScript sites happens in two waves.

Content that requires JavaScript to be rendered will only be indexed in the second wave, which might take some time. In later episodes we will cover how to overcome this, and often also improve the user experience and loading performance, by using techniques such as dynamic rendering, hybrid rendering, or server-side rendering for single-page apps. Another important detail is to allow Googlebot to crawl pages from your website by linking between your pages properly. Make sure to include useful link anchor text, and use the HTML anchor tag with the destination URL of the link in the href attribute.

Do not rely on other HTML elements, such as div or span, or use JavaScript event handlers for this. Not only will crawlers have trouble finding and following these pseudo-links; they also cause issues with assistive technology. Links are an essential feature of the web and help search engines and users find and understand the relationships between pages. If you are using JavaScript to enhance the transition between individual pages, use the History API with normal URLs instead of hash-based routing. Using hashes, also called fragment identifiers, to distinguish between different pages is a hack that crawlers ignore. Using the JavaScript History API with normal URLs, on the other hand, provides a clean solution for the same purpose. Remember to test your pages and server configuration when using JavaScript to do routing on the client side.
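To make both points concrete, here is a small sketch (route paths and markup are illustrative, not from the video): crawlable links use a real anchor tag with an href, and client-side routing maps clean, History-API-friendly paths to views rather than hash fragments.

```javascript
// Crawlable link: a real anchor tag with the destination in href.
const goodLink = '<a href="/recipes/apple-pie">Apple pie recipe</a>';
// Pseudo-link: crawlers and assistive technology cannot reliably follow this.
const badLink = '<span onclick="goTo(\'apple-pie\')">Apple pie recipe</span>';

// Minimal History API routing sketch with clean URLs (no #fragments).
const routes = {
  '/recipes': 'recipe-list',
  '/recipes/apple-pie': 'recipe-detail',
};

function resolveView(pathname) {
  return routes[pathname] || 'not-found';
}

function navigate(pathname) {
  // In a browser you would also update the address bar:
  // history.pushState({}, '', pathname);
  return resolveView(pathname);
}
```

Because each view has its own normal URL, the server must also be able to answer a direct request for each of these URLs with the full content.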

Googlebot will be visiting your pages individually, so neither a service worker nor the JavaScript History API can be used to navigate between pages. Test what a user would see by opening your URLs in a new incognito window: each page should load with an HTTP 200 status code, and all the expected content should be visible. Using semantic HTML markup properly helps users better understand your content, as well as navigate it more quickly. Assistive technologies like screen readers, as well as crawlers, rely on the semantics of your content.

Use headings, sections, and paragraphs to outline the structure of your content, and use HTML image and article tags with captions and alt text to add visuals. This helps crawlers and assistive technology find this content and surface it to your users. If, in contrast, you use JavaScript to generate your markup dynamically, make sure you aren’t accidentally blocking Googlebot in your initial markup, as explained in the previous episode.

The first round of indexing does not execute JavaScript, so markup such as a noindex meta tag in the initial payload can prevent Googlebot from running the second stage with JavaScript. Following these steps will help Googlebot understand your content better and make your content more discoverable in Google Search.

“Hi Googlebot!” “Hi! Did you see the new Webmasters article series?” “No, I did not.” “What? You missed out on so much stuff!” “Really?” “Yes!” “Oh no, how could we have prevented that?” “Well, subscribe and follow our articles!”
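The noindex pitfall described above can be sketched as a quick check (the regex test is a hypothetical illustration, not how Googlebot works internally): if the robots meta tag is in the initial server payload, the second, JavaScript-powered indexing wave may never run, even if your scripts would remove the tag later.

```javascript
// Hypothetical check: does the *initial* HTML payload contain a noindex robots tag?
function hasNoindex(initialHtml) {
  return /<meta[^>]+name=["']robots["'][^>]+noindex/i.test(initialHtml);
}

const initialPayload = '<head><meta name="robots" content="noindex"></head>';
const blocked = hasNoindex(initialPayload); // true: the second wave may never run
```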


 


Multiple H1 Headings: How to Handle Them for SEO & Accessibility? #AskGoogleWebmasters

Today’s question is from Marcus Chaepelli. Marcus is asking: “Can we have a clear answer to the question of how to handle headings and accessibility? I see a lot of multiple H1s (all but one usually hidden) out there on the web. Everybody treats it differently.” So this is a pretty common question, and it’s pretty straightforward.

Our systems don’t have a problem with multiple H1 headings on a page; that’s a fairly common pattern on the web. We use headings to better understand the context of different parts of a page, so having clear, semantically understandable headings is useful in understanding any given page. However, we have to work with the web as we find it, and a lot of it isn’t semantically structured at all.

For users, the difference is minimal: both kinds of pages can be extremely relevant to a question they have. In turn, our systems aren’t too picky, and we’ll try to work with the HTML as we find it, be it one H1 heading, multiple H1 headings, or just styled pieces of text without semantic HTML at all. In short, when thinking about this topic, SEO shouldn’t be your primary objective. Instead, think about your users: if you have ways of making your content accessible to them, be it by using multiple H1 headings or other standard HTML constructs, that’s not going to get in the way of your SEO efforts.

I hope you found this answer useful. If there’s anything else we can answer for you in short article form, please send us your questions using the #AskGoogleWebmasters hashtag on Twitter so that we can include them in one of the future articles. To stay in the loop, make sure to subscribe to the blog, and see you next time!


 


JavaScript: SEO Mythbusting

Where do these come from? How do these myths and legends about JavaScript get into the world? I think a lot of it is that people with very good intentions try to provide the information they have available, and there’s a gap in translation between SEOs and developers, in how they think and what they consider. So by adopting acceptance criteria as part of my tickets when I work with devs, that lets them know very specifically what I need. You go from “make magic for me” to “hey, here’s my user story; I would like to accomplish these three pieces of acceptance criteria.” You can bridge the gap.

Hello, and welcome to another episode of SEO Mythbusting. With me today is Jamie Alberico. Jamie, what do you do in your job? Thank you so much for having me here. I’m a technical SEO with Arrow Electronics. That means that I am embedded with a number of dev teams across a number of projects, and we try to execute these initiatives and get new features available on the site in an effective and search-friendly way. That means a lot of times we have to have conversations about how we’re using our JavaScript.

Having you here is fantastic, because then we can have a conversation about pretty much everything that you want to know from the search side as well as the web developer side. So, any questions that you have in mind, or anything that pops into your mind? Oh, so many questions. I hope I get to poke at the black box of Google here, and I have one that’s absolutely burning: is JavaScript the devil?

That’s a fantastic question. It might seem that way sometimes, especially when things are not going the way you want. You see the horror stories on forums or on Twitter: everything is gone. Yeah, that’s one thing on the SEO side. On the developer side it’s also: oh, it’s a language that wasn’t designed to be super resilient, but it actually is. And then often people say, oh, it’s a C-style language, when it’s not really; it’s a Lisp-style language. There are a lot of misconceptions coming from both worlds and clashing here.

I don’t think it is the devil. I think it has its benefits. It allows us to build really cool and fantastic stuff on the web, be really responsive to what the user does and wants to do with our applications, and it has moved the web from being a document platform towards an application platform. I think that’s fantastic. So I think we are already pushing hard on fighting this “JavaScript is the devil” and “if you use JavaScript, we can’t be indexed at all.” That hasn’t been true for a long time, and I think the documentation is now catching up, outlining the different bits and pieces that you should be aware of and the features that are not available.

One thing, for instance: you have probably built single-page applications, right? Oh, yes. Have there been problems in terms of SEO when they rolled out? I was pretty lucky; I had a dev team who believed in SEO. That’s good, that’s really good. That was actually the big moment of my career as a technical SEO: I came and talked to one of my new developers for the first time with a very specific problem I was trying to solve, and he just paused, looked up from his keyboard, and went, “You’re not snake oil.” So I think we’re making a lot of progress between SEOs and devs.

That is fantastic; it’s a great story. So you might hear a few people in the community going, ooh, should we do a single-page application? Is that risky? One of the things that a bunch of developers are not aware of, and some SEOs are not necessarily communicating all the time, is that we are stateless. With a single-page application you have a bit of application state: you know which page you are looking at and how you transition between these pages.

However, when a search user clicks on a search result, they don’t have this application state; they jump right into the page that we indexed. So we only index pages that can be jumped right into. A lot of JavaScript technology makes assumptions about how the user navigates the application. As a developer, in my test everything is okay: here’s my application, I click on the main navigation for this particular page, then I click on this product, and everything works. But that might not do the trick, because you need a unique URL, something we can get to directly, not a hashed URL, and the server needs to be able to serve that right away. If I do this journey and then take the URL and copy and paste it into an incognito browser, I want people to see the content, not the home page and not a 404 page.

So that’s something we’re working on, giving more guidance for. Lazy loading: you have probably seen a bunch of communication about that one. Yes. How do we get a rich media experience out to users, but do it in a way where, if you’re on your cell phone, we respect the very small time frame we have to get your attention? Correct. You want to make sure that if you have a long list of content, you don’t bring everything in at once, especially on a cell phone; no one feels like loading 100 images. What about Ajax, using asynchronous JavaScript and XML? That is perfectly fine. Whoa, I haven’t heard Ajax mentioned in a while.

I mean, everyone’s using it, but no one’s talking about it that much. You just load data in as you go, and that’s perfectly fine; we are able to deal with that. I often get asked how that affects the crawl budget. Let’s talk. So what worries you about that? Well, say we request a product detail page and we’re using Ajax to pull in a lot of pieces of content. Googlebot has requested one URL and it’s gotten back nine, because each of those Ajax calls had a unique string. How do we handle that, and does that negatively impact our crawl budget?

I wouldn’t say it negatively impacts your crawl budget, because crawl budget is much more complex than you might think. It’s one of these things that looks super simple, but there’s more than meets the eye. We do a bunch of caching, because we expect that content doesn’t necessarily update too much.

Let’s say you have this product page: you make one request to the product page, and then that makes nine more requests. We don’t distinguish between loading the CSS, the JavaScript, the images, or the API calls that get you the product details. So if you have nine calls from this one page load, that’s going to be ten in the crawl budget. But because of caching, we might have some of these in the cache already, and if we have something that is already cached, that doesn’t count towards your crawl budget.

So if we were to version our Ajax calls, those could be cached? Exactly, yes. That’s one way of working around it, if that’s a possibility. The other thing is, you could consider it not just an issue for the crawl budget but also an issue for the user, because if you’re on a slow or spotty network connection, it might flake out in the middle, and you are left with broken content. That’s not a great user experience.
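The versioning idea mentioned above can be sketched like this (the URL scheme and names are made up for illustration): putting a version in the API URL keeps responses stable and cacheable, so repeat fetches of the same data need not count against crawl budget.

```javascript
// Hypothetical versioned API URLs: identical URLs can be served from cache.
const API_VERSION = 'v3';

function productDetailUrl(productId) {
  return `/api/${API_VERSION}/products/${productId}`;
}

const url = productDetailUrl('1234'); // "/api/v3/products/1234"
// Bumping API_VERSION invalidates cached responses when the data format changes.
```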

You probably also want to think about pre-rendering, hybrid rendering, server-side rendering, or anything in between. And crawl budget is tricky generally, because we are trying to deal with the whole “host load” situation: what can your server actually deal with? We are constantly adjusting that anyway. So people say, “oh, this affected our crawl budget negatively,” when really we just had host-load issues with your server and adjusted accordingly, balancing across your entire content.

So I wouldn’t say it’s much of a deal, but I see that it’s very important for people to understand it, and unfortunately that’s not easy. Can we demystify Googlebot a little bit? Because we talk about the great, omnibus Googlebot, but it actually goes through a series of actions: we get that initial HTML parse, we find the JavaScript and CSS that we need to render our content, then we call those pieces.

We know, since Google I/O, that there is actually a gap between the initial parse and the HTML rendering. But I want to know more, because Googlebot follows HTML5 protocols. Yes. There are some nuances there I didn’t know about. Say you’ve got an iframe in your head, and you’ve got a closing head script right there: that ends your head for Googlebot. Yeah. All of our lovely meta content, our hreflangs and canonicals, have a tendency to end up below that.

That is true; there are a bunch of things at play. When we say “Googlebot,” what we actually mean on the other side of the curtain is a lot of moving parts. There’s the crawling bit that literally takes in URLs and fetches them from the server, so that when you provide the content to us, we get the raw HTML. That tells us about the CSS, the JavaScript, and the images that we need to get, and also the links in the initial HTML. Because we have that already, we have a wealth of information.

We can then go off and fetch the JavaScript and everything else that we need to render later on, but we can also already use the HTML that we’ve got and say, “Look, there are links in here that need to be crawled.” When you have links in your initial HTML, we can go off and start the same process for those URLs as well. So a lot of things happen in parallel, rather than one step after another.

So this is definitely the start of it. As we get the HTML, in parallel we extract the links and crawl them, and we queue the page for rendering. We can’t index before we have rendered it, because a bunch of content needs to be rendered first. In a way, that better fits us: if we’ve got a single-page application, Googlebot has the template; it just has to grab the content that fits within there. Yeah.

So, wouldn’t that mean that Googlebot likes these JavaScript platforms? The more content you get us quickly in the first step, the crawling step, the better, because we can then carry that information over rather than having to wait for the rendering to happen. But is prerendering always the best solution? That’s a tricky one. I think most of the time it is, because it has benefits for the user on top of the crawlers, but you have to measure very carefully what you’re doing there. Giving us more content up front is always a great thing.

That doesn’t mean you should always serve a page with a bazillion images right away, because that’s just not going to be good for users. If you’re on a really old phone, and I have a pretty old phone, and you have a page full of images and transitions and stuff, then you’re like, “I can’t use this website.” So pre-rendering everything is not always a great idea.

It should always be a mix: get as much crucial content in as possible, then figure out which content you can load lazily at the end. So for SEOs: we know that different queries have different intents, informational, transactional, and so on, so elements critical to that intent should really be in that initial rush? Exactly. And if the intents are wildly different and the content is very different, consider making it into multiple pages, or at least multiple views if you’re using a single-page application, so that you have an entry point the crawler can specifically point at when it comes to surfacing the search results. So treat it like a hub and let the users branch out from there.

Yes. So is that where we’d use, maybe, our CSS toggle for visibility? That is a possibility, but having different URLs is always an option, especially with the History API. In a single-page application you can figure out which route to display and have the content separated between different routes, or be a little more dynamic there. We support parameters, so even if you use URL parameters, basically expose the state that is relevant to the user in the URL. In what other ways does that benefit our users? Because our ultimate goal is to make them happy. And that’s our ultimate goal too; we are the same in terms of what our goal is. We both want to surface useful information to the user as quickly as possible. The user benefit, especially if you do hybrid rendering or server-side rendering, is that they get the content really quickly.

Normally, if it’s done well and it’s not overloading their device, they get to jump right in where the meaty bits are. If I’m looking for some specific thing and you give me a URL that takes me to that specific thing, I’m right there, and I’ll have a great time, because it’s the content that I needed. And if your performance metrics are going up as well, then even if I’m on a slow phone and a really spotty network, I still get there.

Our performance metrics are based on a lot of pieces; we have a whole stack of technology. That is true. What should SEOs look for in that stack? Where should we try to identify the areas where we could create a better experience, not just for Googlebot, but for our humans? One bit that is oftentimes overlooked, not by SEOs but by businesses and developers, is the content part. You want to make sure that the content is what the users need and want, and that it’s written in a way that helps them.

But on the technology side… Wait. So that blurb at the top people always do, where it’s like, here’s my hero image and then 500 words about this thing, and I’m a human who wants to buy something and there’s so much stuff in the way? Yeah, don’t do it. Or at least have two pages: the promotional page that you want to direct marketing towards, and then, if I specifically look for your product, just give me your product. Just let me give you money.

So, talking about performance and all the different metrics, it’s a bit of a blend of all the things. Look at: when does my content actually arrive? When does my page become responsive? You look at first contentful paint; you look at time to first byte as well, though that is less important than the first contentful paint, I would say, because it’s fine if it takes a little longer as long as the content is then all there.

So time to first byte can take a bit of a hit if we deliver a faster first meaningful paint? Exactly, because in the end, as a user, I don’t care whether the first byte arrived quicker if I’m still looking at a blank page because JavaScript is executing or something is blocking a resource. If it arrives a little later, but then it’s all right there, that’s fantastic. And you can get there in multiple ways.

I highly recommend testing, testing, testing. What testing tools would you recommend? I definitely recommend Lighthouse; that’s a great one. Webhint is a broader approach as well, and you could also use PageSpeed Insights or the new SEO audits in Lighthouse. The Mobile-Friendly Test also gives you a bunch of information. PageSpeed Insights looks at the full page load, though, and we had a bit of a gap there: people optimized for the full-page-load number, and that’s partly how we got so much content loaded via Ajax, because the full page load looked fast but all that content was still coming in.

I would recommend Lighthouse, which gives you the filmstrip view of when things are actually ready for the user to work with. So I would highly recommend looking at Lighthouse, but PageSpeed Insights gives you a good first overview, and it integrates with Lighthouse really nicely now.

Wonderful. Do you think that JavaScript and SEO can be friends now, and that developers and SEOs can work together? I do. I really think that if Google is a library and a webpage is a book, using these JavaScript frameworks lets us make pop-up books: richer experiences to engage with. Oh, that’s a fantastic analogy; I love that image. That’s a beautiful one. Thank you so much, Jamie.

Thank you very much. I hope you enjoyed it, and see you next time. Have you ever wondered where on the map you should put UX and performance when you’re talking about SEO? So have I. Let’s find out in the next SEO Mythbusting episode.



 


Web Performance: SEO Mythbusting

Mythbusting with me today is Ada Rose Cannon, and you’re working for Samsung, is that right? What do you do at Samsung? So, for Samsung, I’m a developer advocate for the web browser Samsung Internet. Samsung Internet is a web browser for Android phones.

You can get it from the Play Store, but not a lot of people have heard about it, so there’s lots to do. What I do is try to raise awareness, but more importantly, I advocate for the web as a platform: to encourage developers to build for it and to make sure it lasts long into the future as a great and healthy platform for people to build stuff with. I love having you here, because I want to talk to you about SEO versus performance and usability on the web, and I think we need to get some stuff out of the way first.

So, what are the most important bits and pieces that you would like people to focus on more when building web stuff? I have a huge passion for ensuring that the web remains great for everyone around the world, not just for people using the latest handsets and desktop computers, because most people aren’t; people are using devices from years ago and low-end, sub-$100 devices where, frankly, the modern web is just not even reaching them today. There’s a fantastic talk from Alex Russell that goes into the reality of people with phones that cost less than $100.

Yeah, that’s a fantastic one. You’d have the naive thought that, as time goes on, phones are getting steadily better, and that a bottom-of-the-line phone nowadays is just as good as a top-of-the-line phone four years ago, when they’re not. The chasm is just getting wider and wider. Something really awesome I heard recently was that Google was factoring performance metrics into its ratings for search results.

So was this front-end web performance, like render speed and making sure it’s not janky, or is it just making sure that a page loads really quickly? It’s a tricky one, because we have so many metrics: we have time to first byte, we have time to interactive, we have time to first meaningful paint, and then you have the frame rates and such. Now, Googlebot, which is the tool that fetches the data and renders your website for Search indexing, doesn’t really interact that much with the page, so we can’t really figure out whether your scrolling is smooth or anything like that. But we do get the rendering bits, so we can tell when the page becomes responsive to input and when the content is ready for the user to consume. So we’re looking at a blend of these kinds of performance measures. Does that make sense? It does make sense.

So, do you have any other qualms with how SEO influences the daily work of a web developer? A friend of mine recently rebuilt her site using React. She was very excited about it, and it seemed to get quite good client-side performance once it all loaded. Unfortunately, when she sent it out to her company’s team which does SEO analysis, they came back with an answer of: we love your site, it’s really good, but you basically don’t appear in the rankings, even though she could show them that, look, right there, it’s on Google. Is Google engaging with people who do SEO analysis to ensure that they’re running up-to-date metrics, similar to Google’s, so that even with a heavily client-side-rendered page they can feel confident it is being measured well?

So, we can’t really fix what people are doing in terms of the tools they’re using. But what we do want is to open this black box of SEO for everyone, so we’re having this conversation with web developers, and we’re having this conversation with SEOs and tool makers, and we provide a bunch of metrics and tools as well. We have Search Console, which gives you a bunch of insights into how you’re doing in Search, so that you’re not relying on someone else basically sticking a finger in the wind and reading the stars.

We also want to make sure that people understand that blanket statements like “JavaScript is going to kill your SEO” or “you cannot use React or Angular” are not necessarily the best way of looking at it. It’s a really comfortable answer, but it’s not the right answer. All right, so at Chrome Dev Summit I saw your great talk on SEO and the web. Thank you. One thing you mentioned was that the rendering Googlebot needs to do to actually process a JavaScript-heavy site could take up to a week to happen.

Does this mean that JavaScript-heavy sites are effectively getting penalized in Google Search results? Right, they’re not getting penalized; they rank just fine. The indexing stage is where the problem is, because, as you say, we process them by first putting them into a rendering queue, and eventually, as we have the resources available, we render them. If the rendering takes a while, that means we cannot refresh the content in the index as quickly. So news sites might want to look into that.

But then again, you have usability issues anyway, right? Yes, and that’s because it’s bad for the user. We try to find search results that are good for users, and if a page takes ages to load, that is not a good experience. So you want to fix that because of the users, not just because of the crawler. I have a bit of a bias against these JavaScript-heavy, client-side-rendered pages, because they’re terrible for everyone who doesn’t have an iPhone or the latest Pixel or a desktop computer.

But anyway, for these sites, if the way they make their money is delivering fresh content daily, does this mean that the content in the search results may actually be out of date for them? They might be lagging, then? Yes, absolutely. And again, it’s very important to give users a great experience, and I don’t think you can do that when you are relying heavily on client-side rendering, because devices might be really old.

One way of working around it, unless you want to properly fix it with hybrid rendering or server-side rendering, is to do dynamic rendering: basically, give us a statically rendered version of your page for the crawler so that we can index it quicker. But that doesn’t make the usability and user-experience problems go away. So would you say it’s generally safer to rely more on HTML and CSS, knowing that they degrade more gracefully than JavaScript? Yes. If you look at the tri-star of technology that we have in the web platform, HTML, CSS, and JavaScript, HTML and CSS are just more resilient than JavaScript, so relying on JavaScript too heavily is probably going to get you into trouble in certain cases, with spotty network connections and such. So I would say: use polyfills, use progressive enhancement, use what the web platform offers you, and use JavaScript responsibly.
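A tiny sketch of that progressive-enhancement advice (the feature check and strategy names are illustrative): feature-detect a JavaScript API before relying on it, and keep a working fallback path for browsers that lack it.

```javascript
// Hypothetical feature detection: only use IntersectionObserver-based
// lazy loading when the API exists; otherwise fall back to eager loading.
function chooseImageStrategy(globalObj) {
  return typeof globalObj.IntersectionObserver === 'function'
    ? 'lazy'    // enhance: defer offscreen images
    : 'eager';  // fallback: plain <img> tags still work everywhere
}

const modernBrowser = { IntersectionObserver: function () {} };
const oldBrowser = {};
```

In a browser, `globalObj` would be `window`; the page stays usable either way, which is the point of building on resilient HTML and layering JavaScript on top.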

It’s really great to hear, especially from a Googler, that reducing reliance on JavaScript and taking advantage of good HTML and CSS where it’s available can actually work wonders for your SEO. Absolutely. Ada, thank you so much for being here and talking to me about performance and SEO. Do you have a feeling that SEOs and web developers can now work together more nicely, or is there still a way to go? I think, as long as the goals of what people are trying to accomplish are clear, and we’re not just resorting to auguries or looking at the stars to work out what Google is thinking, that’s going to enable developers to actually build sites that make sense and take advantage of the platform. Anything Google can do to ensure that the web works for everyone, and not just the wealthy Western web, will be really, really fantastic.

Fantastic closing words. Thank you so much for being here. Thank you. This just in: the next episode of SEO Mythbusting is going to be about SEO in the age of frameworks. Jason Miller and I will talk about what that entails, so stay tuned on this blog, subscribe to Google Webmasters, and see you soon.




SEO Mythbusting 101

People say a lot of things out there about how to make your website stand in the top results, but I don't really know how to achieve that. Right, fair enough. That's a really good question, how to achieve that, and I think it's a perfect introduction to what we're trying to do here: we're trying to bust these myths.

What can I help you with? What are the questions that come to your mind? Okay, so let's start with something simple: what is a search engine? All right, so a search engine is a platform, service, or program, whatever you want to call it, that basically goes through the content on the internet and tries to catalog it. It works a little bit like a library, right? You go to a library and ask the librarian:

"Where can I find a book on topic X?" Right, that's what you do, and then normally it doesn't require you to go through all the books in the library; you just get the right books. That's what search engines do for you: we find the right content for your purpose. All right, but when I've heard of search engines, I've also heard this word "crawling". Is that a thing? That's a thing. The way that search engines do this is by first going through the entire internet, and we have links from one page to another, yeah.

So we use that: we start with some URLs and then basically follow the links from there. So we are basically crawling our way through the internet, one page at a time, more or less. Then, once we have these pages, have found them and grabbed the content from the internet, we need to understand it. We need to figure out what this content is about and what purpose it serves. That's the second stage, which is indexing.

So we figure out: this page is about ice cream, this page is about ice cream in Miami, this page is about marmalade, and so on. And then the last step is when you type something in. You don't type in "I want this particular thing here"; you just go like, "I need ice cream", "ice cream online Medellín", right? Yes, you got it. So we then basically look into our index and find the pages that serve this purpose.

Then we try to figure out which one serves this purpose best, rank those higher than the others, and show you the examples that we found in the index. So how do you know which results are more relevant to a given user? That's a really good question. We have over 200 signals for that. We look at things like the title, the meta description, the actual content that you've got on your page, images, links, all sorts of things.

Right, it's a very complicated question to answer what ranks you best, but yeah, we look at a bunch of signals. Now, if you could give me, like, the top three things that I should consider, what would they be? Right, so us being developers originally, you probably want me to say, "oh, use this framework" or "use that framework". Yeah, that's not how it works. You have to have really good content, and that means you have to have content that serves a purpose for the user.

It's something that users need, and optimally need and want. Okay, like ice cream. So if your content says where you are, what you do, and how you help me with what I'm trying to accomplish, that's fantastic. If you just have a page that says "we are a fantastic company and we have plenty of products", that's not serving a purpose. So you want to make sure to serve the purpose of the people you want to attract and have interact with your content. And you want to make sure that you're using the words that I would be using. If you use a very specific term for your ice cream, let's say "Smooth Cream 5000" or something like that,

I'm not going to search for that, because I don't know about it; I'm just going to go like, "I need ice cream". It's good to mention the name somewhere, so that if I look for that trademark, I find it as well. But if I'm exploring ice cream around me, I don't know what particular ice cream there is. If there's a specific brand, fantastic, but that's not what I'm looking for. So speak the language that I'm using. So you're saying it's more...

...like a pitch? Exactly. When we meet and you have a fantastic product, you wouldn't go like, "yeah, Blurp Master 5000, it's fantastic", and I'm like, "yeah, but it doesn't say what it does". So do an elevator pitch and help us put you in contact with the right people. Okay, so content is the number one priority. Could you mention another two things that are important? Yeah, you're going to love them, because they are technical. The second biggest thing is to make sure that you have meta tags that describe your content.

So I should have a meta description. Okay, because that gives you the possibility of a little snippet in the search results that lets people find out which of the many results might be the one that helps them best. And have page titles that are specific to the page you are serving, so don't have one title for everything: the same title everywhere is bad. If you have titles that change with the content you're showing, that is fantastic, and frameworks have ways of doing that, so consult the documentation. That's definitely something that helps with the content. And the last bit is performance.
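As a rough sketch of per-view titles in a single-page app (the site name, routes, and titles here are all invented; in practice you would use your framework's head-management features instead of hand-rolling this):

```html
<title>Berry Scoop - Ice cream in Medellín</title>
<script>
  // Hypothetical mapping from route to a page-specific title.
  var titles = {
    '/': 'Berry Scoop - Ice cream in Medellín',
    '/flavors': 'Our flavors - Berry Scoop',
    '/order': 'Order online - Berry Scoop'
  };

  // Call this whenever the client-side router changes the view,
  // so each view carries its own title rather than one shared title.
  function updateTitle(path) {
    document.title = titles[path] || titles['/'];
  }

  updateTitle(location.pathname);
</script>
```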

Right, yeah, performance is fantastic. We're talking about it constantly, but we're probably missing out on the fact that it's also good for being discovered online. So performance is not just making my website faster, it's also making my website more visible to others? Correct, because we want to make sure that the people clicking on your search result, clicking through to your page, are getting the content quickly.

That's one thing we want to make sure of as well; it's one of the many signals that we are looking at. But it also just helps your users, right? They get happier: if I want ice cream really badly and I get the page quicker, that's fantastic, yeah. So if you want to look at performance, I highly recommend looking into hybrid rendering or server-side rendering, because that gets the content to the users quicker.

Usually, right. Also, you might have bots that don't run JavaScript: Googlebot does, but not everyone else necessarily does. So you probably want to figure out something like dynamic rendering if you don't want to make code changes, because I understand we're all pressed for time; we have lots of bugs and features to work through. So if you can't change the code, dynamic rendering might be something that gets you there.

Okay, so that's if there are rendering issues with your content. But besides that, I would say definitely look into performance optimization: get the content in quicker, get the First Contentful Paint in quicker, optimize your servers, optimize your caching strategies, and make sure that your script doesn't have to run for 60 seconds to fetch everything that you need. So those are things that you should definitely look into, and I guess performance is something that pretty much everyone in the developer community is looking at.

Certainly, yes. Or they should; at least, I hope that they do. Okay, so we already discussed all these basics around SEO and search engines and how to position my website in the top search results. Now the question is: why is it so important for companies to rank in the top results? Right, so you're a web developer, right? Yes. You build stuff on the internet. Yeah.

Do you want people to use it? Certainly, yes. Right, so in order for people to use it, they have to know about it, and unless you are one of the really big players, they might not; and even for the big players, if they launch something new, you might not know about it. And you're not looking specifically for products, you're looking for something that serves a purpose for you. Okay: I want to know how to build this thing with a framework; I want to know where to find the best ice cream in the place I'm in.

I want to find the cutest dogs and puppers online. So I have a purpose; I don't necessarily know who serves this purpose. So if you build the best ice cream PWA ever in, let's say, Medellín (is that how you pronounce it?), so if you build the best PWA to order ice cream online in Medellín, then I don't know about it, especially if I come as a tourist. But if I type that into a search engine, like "order ice cream in Medellín", and it goes, "hey, this PWA does the trick", yeah, you want to be the first, or among the first couple of results, because I'm not going to go to page 99 and go like, "oh yeah,

this might be the perfect thing". Google and other search engines are trying to figure out what is best for this purpose and then show me those up front, and then I might pick from those, because normally they're pretty good. I think that covers all the questions I have. Fantastic, so you feel ready to build that? Certainly. Excellent, that is so cool. Thank you so much for being here.

Thank you to my guest, and I hope that this helps other developers as well, and that developers and SEOs can be friends. I think so, yeah, I think so. Thank you. Oh, are we still on? Please stay tuned for another episode of SEO Mythbusting: next time, with Suz Hinton, we'll talk about what Googlebot is, so come back again and see what happens!




Essential JavaScript SEO tips – JavaScript SEO

Let's look at a few SEO techniques to help users find your content. All of your pages should have a descriptive, helpful title that says what the page is about in a few words. For example, on recipe pages, avoid using a generic title such as "Barbara's baking blog".

Instead, each page should have the name of the recipe in the title, so it's clear what the page is about. You should also provide a description of what the page contains, for example, what makes this recipe special and what its main characteristics are, so people have something that helps them identify the best page to fulfill their goal. Both of these can be done by adding title and meta tags to your markup.
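A minimal sketch of what that looks like in the head of a recipe page (the recipe name and description are invented):

```html
<head>
  <!-- The title names the specific recipe, not just the site. -->
  <title>Chewy Chocolate Chip Cookies - Barbara's Baking Blog</title>
  <!-- The description says what makes this recipe special. -->
  <meta name="description"
        content="Extra-chewy chocolate chip cookies made with browned butter, ready in 30 minutes.">
</head>
```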

You can check your pages for those tags by right-clicking the page, choosing Inspect, and then searching for "//title" and "//meta" (XPath queries) to find them. If you do not see all of your content in the markup, you are probably using JavaScript to render your page in the browser. This is called client-side rendering and is not a problem per se. Rendering is the process of populating templates with data from APIs or databases.

This can happen either on the server side or on the client side. When it happens on the server, crawlers as well as your users get all the content as HTML markup immediately. In single-page apps, the server often sends the templates and JavaScript to the client, and the JavaScript then fetches the data from the backend, populating the templates as the data arrives. As explained in the first episode, indexing for JavaScript sites happens in two waves.

Content that requires JavaScript will only be indexed in the second wave, which might take some time. In later episodes, we will cover how to overcome this, and often also improve the user experience and loading performance, by using techniques such as dynamic rendering, hybrid rendering, or server-side rendering for single-page apps. Another important detail is to allow Googlebot to crawl the pages of your website by linking between your pages properly, making sure to include useful link anchor text and to use the HTML anchor tag.

Put the destination URL of the link in the href attribute, and do not rely on other HTML elements such as div or span with JavaScript event handlers for this. Not only do crawlers have trouble finding and following these pseudo-links, they also cause issues with assistive technology. Links are an essential feature of the web and help search engines and users find and understand the relationships between pages. If you are using JavaScript to enhance the transitions between individual pages, use the History API with normal URLs.
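For example (URLs and handler names invented), a crawlable link compared with the pseudo-link patterns to avoid:

```html
<!-- Good: a real anchor with a destination URL and descriptive anchor text. -->
<a href="/recipes/chocolate-chip-cookies">Chewy chocolate chip cookie recipe</a>

<!-- Avoid: crawlers and assistive technology cannot follow these. -->
<span onclick="goToRecipe('chocolate-chip-cookies')">Chewy chocolate chip cookie recipe</span>
<div class="link" data-url="/recipes/chocolate-chip-cookies">Chewy chocolate chip cookie recipe</div>
```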

Hash-based routing, which uses hashes (also called fragment identifiers) to distinguish between pages, is a hack that crawlers ignore. The JavaScript History API with normal URLs, on the other hand, provides a clean solution for the same purpose. Remember to test your pages and server configuration when using JavaScript to do the routing on the client side.
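A minimal sketch of client-side routing with the History API (the link, element IDs, and render logic are placeholders):

```html
<a href="/recipes/brownies" class="internal">Brownies</a>
<main id="content"></main>

<script>
  // Intercept internal links and push real URLs instead of #fragment routes.
  document.addEventListener('click', function (event) {
    var link = event.target.closest('a.internal');
    if (!link) return;
    event.preventDefault();
    history.pushState({}, '', link.getAttribute('href'));
    renderRoute(location.pathname);
  });

  // Handle back/forward navigation.
  window.addEventListener('popstate', function () {
    renderRoute(location.pathname);
  });

  // Placeholder: fetch and display the content for a path.
  function renderRoute(path) {
    document.getElementById('content').textContent = 'Showing ' + path;
  }
</script>
```

Every URL created this way must also be served directly by your server, since crawlers and users may request it on its own.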

Googlebot will visit your pages individually, so neither a service worker nor the JavaScript History API can be relied on to navigate between pages. Test what a user would see by opening your URLs in a new incognito window: the page should load with an HTTP 200 status code, and all the expected content should be visible. Using semantic HTML markup properly helps users better understand your content, as well as navigate it more quickly.
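For instance, a made-up recipe page marked up with semantic elements, so its structure is visible to crawlers and screen readers alike:

```html
<article>
  <h1>Chocolate chip cookies</h1>
  <section>
    <h2>Ingredients</h2>
    <p>Flour, butter, sugar, eggs, and chocolate chips.</p>
  </section>
  <figure>
    <img src="/images/cookies.jpg" alt="A plate of chocolate chip cookies">
    <figcaption>The finished cookies, fresh from the oven.</figcaption>
  </figure>
</article>
```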

Assistive technologies such as screen readers, as well as crawlers, rely on the semantics of your content. Use headings, sections, and paragraphs to outline the structure of your content, and use HTML img and figure tags with alt text and captions to add visuals. This helps crawlers and assistive technology find this content and surface it to your users. In contrast, if you use JavaScript to generate your markup dynamically, make sure you aren't accidentally blocking Googlebot in your initial markup.
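One way to block yourself accidentally, sketched with a hypothetical page: a robots noindex meta tag ships in the initial HTML, and client-side JavaScript only removes it later. Since the first indexing wave reads the initial markup without executing JavaScript, the noindex takes effect anyway:

```html
<!-- Risky: this tag is in the initial payload, so Googlebot can drop the
     page before any JavaScript runs. -->
<meta name="robots" content="noindex">
<script>
  // Too late: the first indexing wave never executes this.
  document.querySelector('meta[name="robots"]').remove();
</script>
```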

As explained in the previous episode, the first round of indexing does not execute JavaScript, so markup such as a noindex meta tag in the initial payload can prevent Googlebot from ever running the second stage with JavaScript. Following these steps will help Googlebot understand your content better and make it more discoverable in Google Search. Hi Googlebot! Hi Googlebot, did you see the new Webmasters article series? No, I did not. Oh, you missed out on so much stuff!

Really? Yes! Oh no, how could we have prevented that? Well, subscribe and follow our articles!

