
Introduction to Service Workers

You'll learn what a service worker is and what it can do for your apps. A service worker is a client-side programmable proxy between your web app and the outside world. It gives you fine-grained control over network requests. For example, you can control the caching behavior of requests for your site's HTML and treat them differently than requests for your site's images.

Service workers also enable you to handle push messaging. Service workers are a type of web worker, an object that executes a script separately from the main browser thread. Service workers run independently of the application they are associated with and can receive messages when not active, either because your application is in the background, is not open, or because the browser is closed. The primary uses for a service worker are to act as a caching agent, handling network requests and storing content for offline use, and, secondly, to handle push messaging.

The service worker becomes idle when not in use and restarts when it's next needed. If there is information that you need to persist and reuse across restarts, service workers can work with IndexedDB databases. Service workers are promise-based. We cover this more in other materials, but at a high level a promise is an object that acts as a kind of placeholder for the eventual result of a deferred, and possibly asynchronous, computation. Service workers also depend on two APIs to work effectively: Fetch, a standard way to retrieve content from the network, and Cache, persistent content storage for application data.

This cache is persistent and independent of the browser cache or network status. Because of the power of a service worker, and to prevent man-in-the-middle attacks where third parties tamper with your users' communication with the server, service workers are only available on secure origins served through TLS using the HTTPS protocol. We'll test service workers using localhost, which is exempt from this policy.

By the way, if you're hosting code on GitHub, you can use GitHub Pages to serve content; these pages are provisioned with SSL by default. Services like Let's Encrypt allow you to procure SSL certificates for free to install on your server. Service-worker-enabled applications can control network requests, cache those requests to improve performance, and provide offline access to cached content. But this is just the tip of the iceberg.

We will explore some of the things you can do with service workers and related APIs. Caching assets for your application will make content load faster under a variety of network conditions. Two specific types of caching behavior are available through service workers. The first type of caching is to precache assets during installation. If you have assets (HTML, CSS, JavaScript, images, and so on) that are shared across your application,

you can cache them when you first install the service worker, when your web app is first opened. This technique is at the core of the application shell architecture. Note that using this technique does not preclude regular dynamic caching; you can combine the precache with dynamic caching. The second type of caching is to provide a fallback for offline access. Using the fetch API inside a service worker,

we can intercept a fetch request and then modify the response with content other than the object requested. Use this technique to provide alternative resources in case the requested resources are not available in the cache and the network is unreachable. Service workers can also act as a base for advanced features. Service workers are designed to work as the starting point for features that make web applications work like native apps. Some of these features are: the messaging API, which allows web workers and service workers to communicate with each other and with the host application (examples include new content notifications and updates that require user interaction).

The Notifications API is a way to integrate notifications from your application with the operating system's native notification system. The Push API enables push services to send push messages to an application; a service can send messages at any time, even when the application or the browser is not running. Push messages are delivered to a service worker, which can use the information in the message to update local state or display a notification to the user.

Background Sync lets you defer actions until the user has stable connectivity, which is really useful for ensuring that whatever the user wants to send is actually sent. This API also allows servers to push periodic updates to the app, so the app can update when it's next online. Every service worker goes through three steps in its lifecycle: registration, installation, and activation. To install the service worker,

you need to register it in your main JavaScript code. Registration tells the browser where your service worker is located and to start installing it in the background. For example, you could include a script tag in your site's index.html file, or whatever file you use as your application's entry point, with code similar to the one shown here. This code starts by checking for browser support by looking for serviceWorker as a property of the navigator object.

The service worker is then registered with navigator.serviceWorker.register, which returns a promise that resolves when the service worker has been successfully registered. The scope of the service worker is then logged with registration.scope. You can attempt to register a service worker every time the page loads; the browser will only complete the registration if the service worker is new or has been updated. The scope of the service worker determines from which path the service worker will intercept requests.
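A minimal sketch of that registration code, assuming the service worker lives in a file named service-worker.js at the site root:

if ('serviceWorker' in navigator) {
  navigator.serviceWorker.register('/service-worker.js')
    .then(registration => {
      // Registration succeeded; log where the worker will intercept requests.
      console.log('Service worker registered with scope:', registration.scope);
    })
    .catch(error => {
      console.error('Service worker registration failed:', error);
    });
}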

The default scope is the path to the service worker file and extends to all directories below it. So if the service worker script (for example, service-worker.js) is located in the root directory, the service worker will control requests from all pages on that domain. You can also set an arbitrary scope by passing in an additional parameter when registering. In this example, we're setting the scope of the service worker to /app/, which means the service worker will control requests from pages like /app/lower and /app/lower/lower, but not from pages like /app or /, which are higher. A service worker cannot have a scope above its own path.
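For example, a sketch of registering with an explicit scope (the /app/ path is illustrative):

navigator.serviceWorker.register('/app/service-worker.js', { scope: '/app/' })
  .then(registration => {
    // Only pages under /app/ will be controlled by this worker.
    console.log('Service worker scope:', registration.scope);
  });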

The following code goes in your service worker file, service-worker.js. Now, thinking about installation: once the browser registers a service worker, the install event can occur. This event will trigger if the browser considers the service worker to be new, either because this is the first service worker encountered for this page or because there is a byte difference between the current service worker and the previously installed one.

We can add an install event handler to perform actions during the install event. The install event is a good time to do work such as caching the app's static assets using the Cache API. If this is the first encounter with a service worker for this page, the service worker will install and, if successful, transition to the activation stage. Once activated, the service worker will control all pages that load within its scope and intercept corresponding network requests.
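A minimal sketch of such an install handler; the cache name and the list of precached files are assumptions for illustration:

const CACHE_NAME = 'app-shell-v1';
const PRECACHE_URLS = ['/', '/index.html', '/styles/main.css', '/scripts/main.js'];

self.addEventListener('install', event => {
  // Keep the worker in the installing phase until the app shell is cached.
  event.waitUntil(
    caches.open(CACHE_NAME).then(cache => cache.addAll(PRECACHE_URLS))
  );
});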

However, pages in your app that are already open will not be under the service worker's control, since the service worker was not loaded when those pages were opened. To put currently open pages under service worker control, you must reload the page or pages. Until then, requests from these pages will bypass the service worker and operate just like they normally would. Service workers maintain control as long as there are pages open that are dependent on that specific version.

This ensures that only one version of the service worker is running at any given time. If a new service worker is installed on a page with an existing service worker, the new service worker will not take over until the existing service worker is removed. Old service workers become redundant and are deleted once all pages using them are closed; this activates the new service worker and allows it to take over.

Refreshing the page is not sufficient to transfer control to a new service worker, because there won't be a moment when the old service worker is not in use. The activation event is a good time to clean up stale data from existing caches for the application. Note that activation of a new service worker can be forced programmatically with self.skipWaiting(). Service workers are event-driven: installation and activation fire corresponding events to which the service worker can respond.
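A sketch of an activate handler that removes caches left over from earlier versions, assuming the CACHE_NAME constant from the install sketch above:

self.addEventListener('activate', event => {
  event.waitUntil(
    caches.keys().then(cacheNames =>
      Promise.all(
        cacheNames
          .filter(name => name !== CACHE_NAME) // keep only the current cache
          .map(name => caches.delete(name))
      )
    )
  );
});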

The install event is when you should prepare your service worker for use, for example by creating a cache and adding assets to it. The activate event is a good time to clean up old caches and anything else associated with a previous version of your service worker. The service worker can receive information from other scripts through message events. There are also functional events, such as fetch, push, and sync, that the service worker can respond to. To examine service workers, navigate to the service worker section in your browser's developer tools; different browsers put these tools in different places, so check the "Debugging service workers in browsers" instructions for Chrome, Firefox, and Opera. A fetch event is fired every time a resource is requested.
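A minimal fetch handler along the lines described next; this variant also falls back to the network when the resource is not in the cache:

self.addEventListener('fetch', event => {
  event.respondWith(
    caches.match(event.request).then(cachedResponse => {
      // Serve from the cache when possible, otherwise go to the network.
      return cachedResponse || fetch(event.request);
    })
  );
});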

In this example, we listen for the fetch event and, instead of going to the network, return the requested resource from the cache, assuming it is there. Service workers can also use background sync. Here we start by registering the service worker and, once the service worker is ready, we register a sync event with the tag 'foo'. The service worker can listen for sync events; this example listens for the sync event tagged 'foo' that was registered earlier.
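A sketch of that registration and the matching sync handler; doSomething is a placeholder for your own work and must return a promise:

// In the page's JavaScript: register a one-off sync tagged 'foo'.
navigator.serviceWorker.ready.then(registration => registration.sync.register('foo'));

// In the service worker: respond to the sync event when connectivity allows.
self.addEventListener('sync', event => {
  if (event.tag === 'foo') {
    event.waitUntil(doSomething());
  }
});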

doSomething should return a promise indicating the success or failure of whatever it's trying to do. If it fulfills, the sync is complete; if it fails, another sync will be scheduled to retry. Retried syncs also wait for connectivity and employ an exponential back-off. The service worker can also listen for push events. Push events are initiated by your back-end servers through the browser's push service. This example shows a notification when the push event is received.

The options object is used to customize the notification. The notification could contain the data that was pushed from the server. Service workers can be tested and debugged in the supporting browsers' developer tools; the screenshot here shows the Chrome DevTools Application panel. There are lots of great resources to help you get started and find out more; access them from the materials that accompany this article.

In the lab materials that accompany this article, you can practice working with service workers and learn more about intercepting network requests.



 


JavaScript: SEO Mythbusting

Where do these come from? How do these get out into the world, the myths, the legends that come up around JavaScript? I think a lot of it is people with very good intentions trying to provide the information they have available, and there's a gap in translation between the SEOs and the developers, how they think and what they consider. So by going ahead and adopting acceptance criteria as part of my tickets when I work with devs, that lets them know very specifically what I need, instead of me being like, "I want you to make magic for me." And you go from

"give me magic" to "hey, here's my user story; I would like to accomplish these three pieces of acceptance criteria." You can bridge the gap. Hello, and welcome to another episode of SEO Mythbusting. With me today is Jamie Alberico. Jamie, what do you do in your job? Thank you so much for having me here. I'm a technical SEO with Arrow Electronics. That means that I am embedded with a number of dev teams across a number of projects, and we try to execute these initiatives and get new features available on the site in an effective and search-friendly way. And that means a lot of the time

we have to have conversations about how we're using our JavaScript. Having you here is fantastic, because then we can have a conversation about pretty much everything that you want to know from the search side as well as the web developer side. So... any questions that you have in mind, or anything that pops into your mind? Oh, so many questions. I hope I get to poke at the black box of Google here. And I have one

that's absolutely burning: is JavaScript the devil? That's a fantastic question. It might seem that way sometimes, especially when things are not going the way you want. You see the horror stories; they're on forums or on Twitter: everything is gone. Yeah, that's one thing; that's the SEO side. On the developer side it's also like, oh, it's a language that wasn't designed to be super resilient, but it actually is. And then often people say, oh, it's a C-style language, and it's not really.

It's more of a Lisp-style language. There are a lot of misconceptions coming from both worlds, coming together and clashing here. I don't think it is the devil. I think it has its benefits. I mean, it allows us to build really cool and fantastic stuff on the web and be really responsive to what the user does and wants to do with our applications. And it has moved the web from being a document platform towards an application platform.

And I think that's fantastic. So I think we are already pushing hard on fighting this "JavaScript is the devil" and "if you use JavaScript, we can't be indexed at all." That hasn't been true for a long time, but I think now the documentation is catching up, outlining the different bits and pieces that you should be aware of and the features that you have to deal with that are not available.

One thing, for instance, is you probably have built single-page applications, right? Oh, yes. Have there been problems in terms of SEO when they rolled out? I was pretty lucky; I had a dev team who believed in SEO. That's good, that's really good. That was actually the big moment of my career when I got into technical SEO: I came and talked to one of my new developers for the first time with this very specific problem I was trying to solve, and he just paused and looked up from his keyboard and went, "You're not snake oil." So I think we're making a lot of progress between SEOs and devs.

That is fantastic; it's a great story. So you might hear a few people in the community going like, ooh, should we do a single-page application? Is that risky? And one of the things that a bunch of developers are not aware of, and some SEOs are not necessarily communicating all the time, is that we are stateless. So that means, with a single-page application, you have a bit of an application state, right? You know which page you are looking at and how you transition between these pages.

However, when a search user clicks on a search result, they don't have this application state. They are jumping right into the page that we indexed, so we only index pages that can be jumped right into. A lot of JavaScript technology makes assumptions about how the user navigates the application. So as a developer, in my test it's okay: here's my application,

I click on the main navigation for this particular page, and then I click on this product, and then I see it and everything works. But that might not do the trick, because you need that unique URL. It has to be something we can get right to, not a hashed URL, and the server needs to be able to serve that right away. If I do this journey and then basically take this URL and copy and paste it into an incognito browser, I want people to see the content, not the home page and not a 404 page.

So that's something that we're working on giving more guidance for. Lazy loading: you have probably seen a bunch of communication about that one. Oh, yes. Yeah, how do we get a rich media experience out to users, but do it in a way where, if you're on your cell phone, we keep that very small time frame we have to get your attention? Correct, and you want to make sure that if you have a long list of content, you don't bring everything in at once, especially on a cell phone; you don't want to be loading, like, 100 images. What about AJAX? What about using asynchronous JavaScript and XML? That is perfect. Whoa, I haven't heard AJAX mentioned in a while; the term fell out of use for a while.

I mean, everyone's using it, but no one's talking about it that much. It was just like, yeah, you just load data in as you go, and that's perfectly fine. We are able to do that. Also, I often get asked about how that affects the crawl budget. Let's talk. So what worries you about that? Well, say we request a product detail page and we're using AJAX to supplement a lot of pieces of content on it.

Right: Googlebot has requested one URL and it's gotten back nine, because each of those AJAX calls had a unique string, right? How do we handle that, and does that negatively impact our crawl budget? So I wouldn't say it negatively impacts your crawl budget, because crawl budget is much more complex than you might think. It's one of these things that looks super simple, but there's more than meets the eye. We're doing a bunch of caching, because we expect that content doesn't necessarily update too much.

So let's say you have this product page. You make one request to the product page, and then that makes nine more requests. We don't distinguish between loading the CSS or the JavaScript or the images or the API calls that get you the product details, so if you have nine calls from this one page load, that's going to be ten in the crawl budget. But because of caching, we might have some of these in the cache already, and if we have something that is already cached, it doesn't count towards your crawl budget.

So if we were to version our AJAX calls, those could be cached? Those could be cached, exactly. Yes, and that's one way of working around it, if you can do that, if that's a possibility. The other thing is, you could also consider it not just an issue for the crawl budget but also an issue for the user, because if you're on a slow network or a spotty network connection, it might flake out in the middle and you're left with broken content. That's not a great user experience.

You probably want to think about pre-rendering or hybrid rendering or server-side rendering, anything in between there. And crawl budget is tricky generally, because we are trying to deal with the whole "host load" situation: what can your server actually deal with? We are constantly adjusting that anyway. So it's like, "oh, this affected our crawl budget negatively." Not really; we just had host load issues with your server, so we adjusted it anyway, because we had balancing issues across your entire content.

So I wouldn't say that it's much of a deal, but I see that it's very important for people to understand it, and unfortunately that's not that easy. Can we demystify Googlebot a little bit? Because we have this omnibus, the great Googlebot, but it actually goes through a series of actions. We get that initial HTML parse; we find the JavaScript and CSS that we need to go ahead and render our content, then call those pieces.

We know, since Google I/O, that there is actually a gap between our initial parse and our HTML rendering. But I want to know more, because Googlebot follows HTML/HTML5 protocols. Yes. There are some nuances there I didn't know about. Say you've got an iframe in your head, with a closing head right there: that ends your head for Googlebot. Yeah. All of our lovely meta content, our hreflangs and canonicals, have a tendency to live below that.

That is true; there are a bunch of things at play. So when we say Googlebot, what we actually mean, on the other side of the curtain, is a lot of moving parts. There's the crawling bit that literally takes in URLs and fetches them from the server, so that when you are providing the content to us, we get the raw HTML. That tells us about the CSS, the JavaScript, and the images that we need to get, and also about the links in the initial HTML. And because we have that already, we have such a wealth of information already.

We can then go off and fetch the JavaScript and everything that we need to render later on, but we can also already use the HTML that we've got and say, "Oh look, there are links in here that need to be crawled." So when you have links in your initial HTML, we can go off and basically start the same process for these URLs as well. A lot of things happen in parallel, rather than just one step and then the next step, and then the next step.

So this is definitely the start of it, and as we get the HTML, in parallel to extracting the links and then crawling those, we queue the page for rendering. We can't index before we have rendered it, because a bunch of content needs to be rendered first. In a way, that fits us better: if we've got a single-page application, Googlebot has the template; it just has to grab the content that fits within it. Yeah.

So wouldn't that mean that Googlebot likes these JavaScript platforms? The more content you get us quickly, in the first step, the crawling step, the better, because we can then basically carry that information over rather than having to wait for the rendering to happen. But is prerendering always the best solution? That's a tricky one. I think most of the time it is, because it has benefits for the user on top of just the crawlers, but you have to very carefully measure what you're doing there. I think so; giving us more content up front is always a great thing.

That doesn't mean that you should always give us a page with a bazillion images right away, because that's just not going to be good for the users. If you're on a really old phone, and I have a pretty old phone, and you have a page full of images and transitions and stuff, then you're like, "I can't use this website." So pre-rendering everything is not always a great idea.

It should always be a mix between getting as much crucial content in as possible and then figuring out which content you can load lazily at the end. So for SEOs, that would be: we know that different queries have different intents, informational, transactional, and so on, so elements critical to that intent should really be in that initial rush. Exactly, and you might consider, if the intents are wildly different and the content is very, very different, making it into multiple pages, or at least multiple views if you're using a single-page application, so that you have an entry point for the crawler to specifically point at when it comes to surfacing the search results. So treat it like a hub and let the users branch out from there.

Yes. So that's where we'd use, maybe, our CSS toggle for visibility? That is a possibility. Just having different URLs is always an option, especially with the History API: you can, in a single-page application, figure out which route to display and then have the content separated between different routes, or be a little more dynamic there. We also support parameters, so even if you use URL parameters, you can expose the state that is relevant to the user in the URL.
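As a rough illustration of exposing that state in real URLs with the History API (the route and the showProduct helper are hypothetical):

document.querySelector('#product-link').addEventListener('click', event => {
  event.preventDefault();
  history.pushState({ productId: 42 }, '', '/products/42'); // a real, indexable URL
  showProduct(42);
});

// Re-render the right view when the user navigates back or forward.
window.addEventListener('popstate', event => {
  if (event.state && event.state.productId) {
    showProduct(event.state.productId);
  }
});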

What other ways does that benefit our users? Because our ultimate goal is to make them happy. And that's our ultimate goal too, so we are the same in terms of what our goal is: we both want to surface useful information to the user as quickly as possible. The user benefits, especially if you do hybrid rendering or server-side rendering, because they get the content really quickly.

Normally, if it's done well and it's not overloading their device, they get to jump in right where the meaty bits are. So if I'm looking for some specific thing, and you give me a URL that I can use to go to that specific thing, I'm right there and I'll have a great time, because it's the content that I needed. So yeah, if you have performance metrics going up as well, then even if I'm on a slow phone and a really spotty network, I still get there.

I mean, our performance metrics are based on a lot of pieces; we have a stack of technology. That is true. What should SEOs look for in our stack? Where should we try to identify those areas where we could have a better experience, not just for Googlebot but for our humans? Yeah, so I think a bit that is oftentimes overlooked, not by SEOs but by businesses and developers, is the content part. You want to make sure that the content is what the users need and want, and that it's written in a way that helps them.

But on the technology side... Wait, so that blurb at the top people always do, where it's like, here's my hero image and then 500 words about this thing, and I'm a human who wants to buy something and there's so much stuff in the way... Yeah, don't do it. At least have two pages: the promotional page that you want to do direct marketing towards, and then, if I specifically look for your product, just give me your product.

Just let me give you money. So I think, talking about performance and all the different metrics, it's a bit of a blend of all the things. Look at: when does my content actually arrive? When does my page become responsive? So you look at first contentful paint; you look at time to first byte as well, though that's less important than the first contentful paint, I would say, because it's fine if it takes a little longer,

as long as the content is then all there. So time to first byte can take a bit of a hit if we deliver that faster first meaningful paint? Exactly, because in the end, as a user, I don't care whether the first byte has arrived quicker if I'm still looking at a blank page because JavaScript is executing or something is blocking a resource. If it arrives a little later, but then it's right there, that's fantastic, right? And you can get there in multiple ways.

I highly recommend testing, testing, testing. What testing tools would you recommend? I definitely recommend Lighthouse; that's a great one. Webhint is a broader approach as well, and you could also use PageSpeed Insights or the new SEO audits in Lighthouse. The Mobile-Friendly Test also gives you a bunch of information. PageSpeed Insights looks at the full page load, though, mm-hmm, and we had a bit of a gap.

We have this almost futurist Lighthouse, where we want that time to interactive, and then we have people adopting this methodology. That's how we got, you know, so much content via AJAX, because the full page load is fast, but all that content was still coming... I would recommend Lighthouse; it gives you the filmstrip view of when things are actually ready for the user to work with. So I would highly recommend looking at Lighthouse, but PageSpeed Insights gives you a good first overview, and it integrates with Lighthouse really nicely now.

Wonderful. Do you think that JavaScript and SEO can be friends now, and that developers and SEOs can also work together? I do. I really think that, you know, if Google is a library and a webpage is a book, using these JavaScript frameworks lets us make pop-up books, richer experiences to engage with. Oh, that's a fantastic analogy. I love that image; that's a beautiful one. Thank you so much, Jamie.

Thank you very much, and I hope you enjoyed it and see you next time. Have you ever wondered where on the map you should put UX and performance when you're talking about SEO? So have I. Let's find out in the next SEO Mythbusting episode.



 


Intro to Web Push & Notifications

This diagram gives an overview. On the client side, your webpage interacts with service workers, which in turn receive push events via the user agent, also known as the browser; on the backend,

you send messages from your application server to the push service, which then delivers them to the correct client. Let's look at the Notification API first. This allows developers to display notifications to the user. Before we can create a notification, we need to get permission from the user. This code will prompt the user for permission to show notifications. You can try this out from the browser console. As you'll see later, permission is requested automatically when subscribing to a push service,
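That prompt looks roughly like this; you can paste it into the browser console to try it:

Notification.requestPermission().then(permission => {
  // permission is 'granted', 'denied' or 'default'
  console.log('Notification permission:', permission);
});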

so there's no need to call this function when using just push notifications. Let's take a look at some examples of configuring and displaying a notification from a service worker. We first check that permission has been granted; then we call showNotification on the service worker registration object and pass in the notification title. You can also try this out from the browser console; try it on the new tab page.
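A minimal sketch of showing a notification from the page once permission has been granted:

if (Notification.permission === 'granted') {
  navigator.serviceWorker.getRegistration().then(registration => {
    if (registration) {
      registration.showNotification('Title of the notification');
    }
  });
}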

Now, for push notifications, you call showNotification in the service worker in response to a push event, when a message arrives. We can specify an optional options object to configure the notification; this is passed in as the second argument to the showNotification function. The body property is the body text displayed below the title. Icon is the image displayed at the top of the notification. Vibrate is the vibration pattern for phones, a sequence of alternating on and off durations in milliseconds. Data is arbitrary data we can retrieve in the service worker when the user interacts with the notification.
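An illustrative options object; the file paths, vibration pattern, and data values are placeholders (in a service worker, the registration is available as self.registration):

const options = {
  body: 'Body text displayed below the title',
  icon: 'images/notification-icon.png',
  vibrate: [100, 50, 100],           // on/off durations in milliseconds
  data: { primaryKey: 1 },           // arbitrary data for later handling
  actions: [
    { action: 'explore', title: 'Go to the site', icon: 'images/checkmark.png' },
    { action: 'close', title: 'Close', icon: 'images/xmark.png' }
  ]
};
self.registration.showNotification('New message', options);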

In this example, primaryKey allows us to identify which notification was clicked when handling the interaction in the service worker. Let's try that out. We can add action buttons to the notification, which we can then handle each in a different way; here's what that looks like. Notification interaction events (tapping, clicking, or closing the notification) are handled in the service worker. There are two notification interactions you can listen for in the service worker.

The notificationclose event only triggers when the notification is dismissed via a direct action on the notification; if the user dismisses all notifications, the event will not trigger, and this is done to save resources. Next, notificationclick: if the user clicks the notification or an action button in the notification, the notificationclick event is triggered. If the user clicked on an action, the action is attached to the event object in the notificationclick handler.

We can check which action was triggered and handle it separately. Now let's see how the two handlers work in a service worker. First, notificationclose: we access the notification object from the event object, and we can get the data from the notification object. We might use the primaryKey property from the data to identify which notification was closed. In a notificationclick handler, we can determine which action button

the user pressed by inspecting the action property on the event object. Note that each browser displays notification actions differently, and some don't display them at all. To compensate, in this example we put a default experience in an else block, after checking which action was clicked, so that something will happen on a simple click of the notification.
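A sketch of the two handlers just described; the URLs opened are illustrative:

self.addEventListener('notificationclose', event => {
  const data = event.notification.data;
  console.log('Notification closed, primaryKey:', data.primaryKey);
});

self.addEventListener('notificationclick', event => {
  event.notification.close();
  if (event.action === 'explore') {
    event.waitUntil(clients.openWindow('/explore.html'));
  } else if (event.action === 'close') {
    // The user explicitly dismissed the notification; nothing more to do.
  } else {
    // Default experience for a plain click on the notification body.
    event.waitUntil(clients.openWindow('/'));
  }
});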

Now let's see how you send push messages from your server and handle incoming messages in your client web app. Each browser manages push notifications through its own system, called a push service. When a user grants permission for push on your site, you subscribe them to the browser's push service. This creates a subscription object that includes a public key, to enable messages to be encrypted, and an endpoint URL for the browser's push service, which is unique for each user. From your server, send your push messages to this URL, encrypted with the public key.

The push service sends the message to the right client. The service worker will be woken up to handle incoming push messages when a push event is fired, and this allows your app to react to push messages, for example by displaying a notification using ServiceWorkerRegistration.showNotification. Your app doesn't need to listen for or poll for messages, and the browser doesn't even need to be open.

All the work is done under the hood, as efficiently as possible, by the browser and the operating system, and this is great for saving battery and CPU. Let's go through that step by step. In the app's main JavaScript, call pushManager.subscribe on the service worker registration object, get the subscription object and convert it to JSON, get the endpoint URL and public key, and save these to your server, for example by using a fetch request. Then send the message payload from your server to the endpoint URL, encrypted with the public key.
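A rough sketch of those steps; the /api/save-subscription endpoint is an assumption, and current browsers typically also require the applicationServerKey shown later in this article:

navigator.serviceWorker.ready
  .then(registration => registration.pushManager.subscribe({ userVisibleOnly: true }))
  .then(subscription =>
    fetch('/api/save-subscription', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      // The serialized subscription contains the endpoint URL and keys.
      body: JSON.stringify(subscription)
    })
  );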

The push message raises a push event in the service worker, which we can handle in a push event handler; in the push event handler, we get the data from the message and display a notification. The Push API allows users to subscribe to messages sent from your app server via the push service used by the browser. Subscribing, of course, is done in the JavaScript for the page; responding to push events, for example by displaying a notification, is done in the service worker. Just to repeat: subscribing to the push service and getting the subscription object happen in the JavaScript for the page.

First, we check whether the user is already subscribed and update the page UI accordingly. If they are not subscribed, prompt them to subscribe; if they are already subscribed, update the server with the latest subscription, since it may have been changed by the push service since it was last used. When the user grants permission for push on your site, you subscribe them to the browser's push service, as mentioned before. This creates a special subscription object that contains the endpoint URL for the push service, which is different for each browser, along with a public key.

We send the subscription object for this user to the server and save it. Before you subscribe a user, check whether you already have a subscription object. If you don't have the object, update the UI to prompt the user to enable push notifications; if you do have the subscription object, update your server database with the latest subscription object. The ready property of the service worker container defines whether a service worker is ready to control the page or not.

It returns a promise which resolves to a service worker registration object when the service worker becomes active. The getSubscription function returns the subscription object, or undefined if it doesn't exist. We need to perform this check every time the user accesses our app, because it is possible for subscription objects to change during their lifetime.
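A minimal sketch of that check; updateServer and showEnablePushButton are hypothetical helpers for your own server call and UI:

navigator.serviceWorker.ready
  .then(registration => registration.pushManager.getSubscription())
  .then(subscription => {
    if (subscription) {
      updateServer(subscription);      // already subscribed: refresh the stored subscription
    } else {
      showEnablePushButton();          // not subscribed: offer push notifications in the UI
    }
  });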

This is the process of subscribing to the push service: register the service worker from the main page (main.js). This request goes to the user agent, and the user agent returns the service worker registration object. Use the service worker registration object to access the push manager API and, from that, request a subscription to the push service. This request is passed on to the push service, and the push service returns the subscription object, which includes the endpoint URL and the public key. Save the subscription object data to your server and send push messages from your server to the endpoint URL, encrypted with the public key.

As mentioned before, before sending notifications we must subscribe to a push service. We call pushManager.subscribe on the service worker registration object to subscribe, and the resulting push subscription object includes all the information the application needs to send a push message: an endpoint and the encryption key needed for sending data. Each subscription is unique to a service worker. The endpoint for the subscription is a unique capability

URL: knowledge of the endpoint is all that is necessary to send a message to your application. The endpoint URL therefore needs to be kept secret, or other applications might be able to send push messages to your application. Here's an example of the subscription object. This is the object returned from the push service when we call registration.pushManager.subscribe. The subscription object has two parts.
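Roughly, it looks like this; the endpoint and key values below are placeholders, and a real endpoint is a long, unique URL on the browser vendor's push service:

{
  "endpoint": "https://push-service.example.com/some-unique-subscription-id",
  "keys": {
    "p256dh": "BASE64-ENCODED-ECDH-PUBLIC-KEY",
    "auth": "BASE64-ENCODED-AUTH-SECRET"
  }
}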

The first part is the endpoint URL, the address on the push service to send messages to; this includes an ID that enables the push service to send a message to the correct client and service worker. The second part of the subscription object is the keys property. The p256dh key is an Elliptic Curve Diffie-Hellman (ECDH) public key used for message encryption. The auth key is an authentication secret that your application server uses to authenticate its messages.

These keys are used by your application server to encrypt and authenticate messages for the push subscription. Now let's see how the process of sending a message works. The server generates a message, encrypts it with the public key, and then sends it to the endpoint URL in the subscription object. The URL contains the address of the push service along with a subscription ID, which allows the push service to identify the client that should receive the message.

The message is received by the push service, which routes it to the right client. So the process of sending a push message from the server works like this: a back-end service on your server sends a push message to the push service using the endpoint URL from the subscription object; the message must be encrypted with the public key from the subscription object. The push service uses the subscription ID encoded in the endpoint URL to send the message to the right user agent.

The push event is picked up by the service worker. The service worker gets the data from the message and displays a notification. In this example, we're using Google's web-push library for Node.js to send a push message from a Node.js server. The TTL value in the options specifies the time in seconds that the push service should keep trying to deliver the message. This is important to set correctly: some messages

have a short life, while others may be valid for several hours or more. We then pass the subscription object, payload, and options object to sendNotification. You need a way to ensure secure communication between the user and your server, between your server and the push service, and between the push service and the user. In other words, the user needs to be sure that messages are from the domain they claim to be from and have not been tampered with by the push service, and you need to make sure the user is who they claim to be. VAPID was created to solve this problem.

This VAPID identification information can be used by the push service to attribute requests made by the same application server to a single entity. It can be used to reduce the reliance on secrecy for push subscription URLs, by being able to restrict subscriptions to a specific application server. An application server is also able to include additional information that the operator of a push service can use to contact the operator of the application server. In order to use VAPID, we need to generate a public/private key pair and subscribe to the push service using the public key.

The public key must first be converted from URL-safe base64 to a Uint8Array; this is then passed into the applicationServerKey parameter of the subscribe method. The web-push library provides a method, generateVAPIDKeys, which generates the keys. This should be done once, from the command line with web-push generate-vapid-keys --json, and the keys stored somewhere safe. We can then use the web-push library to send a message with the required VAPID details.
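Before moving on to sending, here's a sketch of the client-side part: converting the VAPID public key and passing it to subscribe (the key value itself is elided):

function urlBase64ToUint8Array(base64String) {
  const padding = '='.repeat((4 - base64String.length % 4) % 4);
  const base64 = (base64String + padding).replace(/-/g, '+').replace(/_/g, '/');
  const rawData = atob(base64);
  return Uint8Array.from([...rawData].map(char => char.charCodeAt(0)));
}

const vapidPublicKey = '...'; // the public key generated with web-push

navigator.serviceWorker.ready.then(registration =>
  registration.pushManager.subscribe({
    userVisibleOnly: true,
    applicationServerKey: urlBase64ToUint8Array(vapidPublicKey)
  })
);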

We add a vapid details, object in the options parameter. That includes the parameter required for the request signing now. Let’S look at messages from the receiving end in the web. App on the client handling push, events happens in the surface worker, the service worker will be woken up to handle incoming push messages and a push event is fired. This allows your app to react to push messages, for example, by displaying a notification using service worker registration, show notification to display a push notification.

To display a push notification, you listen for the push event in the service worker and get the push message data from the push event object; in this example, we simply convert the message data to text. We wrap showNotification in a waitUntil to extend the lifetime of the push event until the showNotification promise resolves; the push event will not be reported as successfully completed until the notification has been displayed.
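A minimal push handler along those lines:

self.addEventListener('push', event => {
  const payload = event.data ? event.data.text() : 'no payload';
  event.waitUntil(
    self.registration.showNotification('Push message received', { body: payload })
  );
});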

You can practice working with the Notification and Push APIs by following the lab that accompanies this article. One small gotcha: don't use private or incognito mode for this lab. For security reasons, push notifications are not supported in private or incognito mode.



 


Indexing your PWA (Discoverability & SEO) – Progressive Web App Training

Every search engine has a different way of ranking pages, but they all depend on a web crawler to gather information, and when you build a JavaScript-driven site, the crawler might not be able to find everything; you might need to give it a little help.

While every search engine has its own way of crawling, there are two fairly obvious rules: first, if the crawler can't see it, it's not going to be indexed; second, everything needs its own URL. There may be a trivial solution for your site: if customers always search for a landing page or other static content, serve those pages as static content. This won't index client-rendered content, but that may be exactly what you want.

This does raise an interesting distinction: a PWA does not have to be a single-page app. You could add a service worker to every page in a website or a multi-page app; as long as these pages have the same origin and path, they will share a service worker. Another option is to server-render the dynamic content and then let the client take over rendering. This lets any crawler see and index all of your content.

You can use these solutions with any crawler, since there's no JavaScript involved, and if you want your app to be indexed everywhere, you'll have to render it on the server. You can write code that renders on the client or as server-side JavaScript; this is called isomorphic JavaScript, but it assumes you're using Node or another JavaScript server. And if you want an easy test, you can run Lighthouse.

It includes some basic SEO and discoverability tests. Lighthouse runs these basic SEO tests as if it were an HTML-only crawler, and each test has instructions for fixing or improving shortcomings. OK, so the universal answer is not to depend on JavaScript, but Google's crawler can run JavaScript, so it can index client-rendered sites as long as you follow some rules. There are about a dozen rules, but the top five will take you most of the way. We've already covered

the first rule: make your content crawlable. That means rendering it so the crawler can find it. If you're writing a single-page app, the top five rules become these top five tips. Many developers provide navigation links with a hash for the URL and use a click listener instead; these links should point to actual paths in your app to trigger changes. You also need to avoid URL fragments (the part that begins with a hash sign); these break many tools and libraries and are now deprecated.

We used to recommend hash-bang prefixes for crawling AJAX-powered sites as a way to change URLs without reloading the page, but now you should use the History API instead. The next rule is to use canonical URLs for duplicate content. For example, AMP pages normally have a server-rendered page and a client-rendered AMP page; the client-rendered page has a link back to the server-rendered page using the rel="canonical" attribute.

The crawler will index the canonical, server-rendered page. Some developers even shadow their client-rendered pages with server-rendered pages and use the canonical link to point back to the server; this makes more of the app discoverable. Tip number four also gives you great accessibility: use native HTML elements whenever possible. Crawlers know what to do with an actual button but won't recognize a div with class "button" in the same way. Finally, use progressive enhancement, and use polyfills where it makes sense to support older browsers.

You never know which version of a browser is used by a particular crawler, so play it safe. Some simple changes can improve your data quality and give users much better results. One is to use schema.org annotations for structured data; there are predefined schemas for common areas such as e-commerce, scheduling, and job postings, and search engines can use the schema annotations to parse your data accurately.
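For example, a minimal JSON-LD sketch for a product page; the product details are placeholders:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example product",
  "description": "Illustrative product description",
  "offers": { "@type": "Offer", "priceCurrency": "USD", "price": "19.99" }
}
</script>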

The same logic applies to the Open Graph protocol, which allows any web page to become a rich object in a social graph. Finally, Twitter Cards provide a rich media card that displays when anyone links to your site from Twitter. It's important to test your work and to work iteratively, so that you can see the effects of each change. Testing on multiple browsers is not only a best practice for everyday development,

it also ensures your site renders correctly for multiple crawlers. Testing with the Google Webmasters Search Console will crawl your site and show the result, and you should always pay attention to loading performance. Use tools such as PageSpeed Insights or WebPageTest to measure the loading performance of your site; remember, about 40% of consumers will leave a page that takes longer than 3 seconds to load.

Of course, the most important rule is to treat client-side rendering as a progressive enhancement. If you test on a range of browsers, you're probably fine. If you want to be certain, you can use the Fetch as Google tool on the site. If that went by a little fast, see the Google Webmaster Central blog for the details on how to make your PWA search-ready. Then come back here and I'll tell you how to measure user engagement in your PWAs. Thanks for reading.



 


Web Performance: SEO Mythbusting

Mythbusting with me today is Ada Rose Cannon, and you're working for Samsung, is that right? What do you do at Samsung? So, for Samsung I'm a developer advocate for the web browser Samsung Internet. Samsung Internet is a web browser for Android phones.

You can get it from the Play Store, but not a lot of people have heard about it, so a lot of what I do is trying to raise awareness. But more importantly than that, what I'm trying to do is advocate for the web as a platform, to try to encourage developers to build for it and to make sure it lasts long into the future as a great and healthy platform for people to build stuff with. I love having you here, because I want to talk to you about SEO versus performance and usability on the web, and I think we need to get some stuff out of the way first.

So, would you say what are the most important bits and pieces that you would like people to focus on more when building web stuff? So I have a huge passion for ensuring that the web remains great for everyone around the world, not just people using the latest handsets and desktop computers, because most people aren't; people are using devices from years ago and low-end, sub-$100 devices where, frankly, today the modern web is just not even reaching them. There's a fantastic talk from Alex Russell who goes into the reality of people with phones that cost less than $100. And yeah,

that's a fantastic one. You'd have the naive thought that, as time goes on, phones are getting steadily better, and that a bottom-of-the-line phone nowadays is just as good as a top-of-the-line phone four years ago, when they're not; the chasm is just getting wider and wider, rather than anything else. Something really awesome I heard recently: Google was factoring performance metrics into its ratings for search results.

I don't know, so was this front-end web performance, like render speed, making sure it's not janky, or is this just making sure that a page loads really quickly? So it's a tricky one, because we have so many metrics, right? We have time to first byte, we have time to interactive, we have time to first meaningful paint, and then you have the frame rates and stuff. Now, Googlebot, which is the tool that basically fetches the data and renders your website for search indexing:

we don't really interact that much with the page, so we can't really figure out whether your scroll is smooth or something like that, but we do get the rendering bits. So we can tell when the page becomes responsive to inputs and when the content is ready for the user to consume, so we're looking at a blend of these kinds of performance metrics. Does that make sense? It does make sense.

So do you have any other qualms with how SEO influences the daily work of a web developer? So, a friend of mine recently rebuilt her site using React. She was very excited about it and seemed to get quite good client-side performance once it all loaded. Unfortunately, when she sent it out to her company's team which does SEO analysis, they came back with an answer of: we love your site,

it's really good, but you basically don't appear in the rankings, even though she could show them that, look, right there, it's on Google. Is Google engaging with people who do SEO analysis to ensure that they're running up-to-date metrics, similar to the ones Google uses, so that even with a heavily client-side rendered page they can feel confident that it is being measured well? So we can't really fix what people are doing in terms of the tools

they're using or anything, but what we do want is to open this black box of SEO for everyone. So we're having this conversation with web developers, we're having this conversation with SEOs and tool makers, and we provide a bunch of metrics and tools as well. We have Search Console, which gives you a bunch of insights into how you're doing in search, so that you're not relying on someone else basically sticking a finger in the wind and reading the stars and stuff. And we also...

We also want to make sure that people understand that blanket statements like "JavaScript is going to kill your SEO" or "you cannot use React or Angular" are not necessarily the best way of looking at it. It's a really comfortable answer, probably, but it's not always the right answer. All right, so at Chrome Dev Summit I saw your great talk on SEO and the web. Thank you. And one thing you mentioned was that the rendering by Googlebot, to actually process a JavaScript-heavy site, could take up to a week to happen.

Does this mean that JavaScript-heavy sites are effectively getting penalized in Google Search results? Right, they're not getting penalized; they are ranking just fine, but the indexing stage is where the problem is. Because, as you say, we are processing them by putting them first into a rendering queue, and then eventually, as we have the resources available, we render them, and if the rendering takes a while to actually happen, that means we cannot refresh the content in the index as quickly. So news sites might want to look into that.

But then again, you have usability issues anyway, right? Yes, right, and that's because it's bad for the user. We try to find search results that are good for the users, and if a page takes ages to load, that is not a good experience for me. So you want to fix that because of the users, not necessarily just because of the crawler. So if a page is built using... I know I have a bit of a bias against these JavaScript-heavy, front-end, client-side rendered pages, because they're terrible for everyone who doesn't have an iPhone or the latest Pixel or something, yeah, or a desktop computer.

But anyway, for these sites, if the way they make their money is delivering fresh content daily, does this mean that the content in the search results may actually be out of date for them? They might be lagging, then? Yes, absolutely. And I think, again, it's very important to give the users a great experience, and I don't think you can do that when you are relying heavily on client-side rendering, because their devices might be really old. So yeah, one way of working around it,

unless you want to properly fix this and do hybrid rendering or server-side rendering, is to do dynamic rendering and basically give us a static rendered version of your page for the crawler, so that we can index it quicker. But that doesn't make the usability and user experience problems go away. So would you say it's generally safer to rely more on the latest HTML and CSS, knowing that they degrade more gracefully than JavaScript? Yes. If you look at the three core technologies that we have in the web platform, HTML, CSS, and JavaScript, HTML and CSS are just more resilient than JavaScript, so relying on JavaScript too heavily is probably always going to get you into trouble with certain devices and spotty network connections and stuff. So I would say: use polyfills, use progressive enhancement, use what the web platform offers you, and use JavaScript responsibly. Yeah.

It's really great to hear, especially from a Googler, that reducing reliance on JavaScript and taking advantage of good HTML and CSS, where it's available, can actually do wonders for your SEO. Absolutely. Ada, thank you so much for being here and talking to me about performance and SEO. And do you have a feeling that SEOs and web developers can work together more nicely, or is there still...? I think, as long as the goals of what people are trying to accomplish are clear, and we're not just resorting to auguries or looking at the stars to work out what Google is thinking, then it's going to enable developers to actually build sites that make sense and take advantage of the platform. Anything Google can do to ensure that the web works for everyone, and not just the wealthy Western web, will be really, really fantastic.

Fantastic closing words. Thank you so much for being here. Thank you. This just in: the next episode of SEO Mythbusting is going to be about SEO in the age of frameworks; Jason Miller and I will talk about what that entails. So stay tuned on this blog, subscribe to Google Webmasters, and see you soon.



 


How to hire an SEO

Search engine optimization: to some, SEO seems like black magic. Having worked with Google Search for over a decade, what I've learned is that, first, it's not black magic, and second, if you want long-term success, there aren't any quick magical tricks that an SEO will provide so that your site ranks number one.

It's important to note that an SEO's potential is only as high as the quality of your business or website, so successful SEO helps your website put its best foot forward so that it ranks appropriately, in the spot where an unbiased potential customer would expect your site to be seen. Successful SEO also looks to improve the entire searcher experience, from the search results to clicking on your website and potentially converting. A good SEO will recommend best practices for a search-friendly site, from basic things like descriptive page titles for a blog or small business

To more complex things like language markup for a multilingual global site SEO is ensure that you’re serving your online customers a good experience, especially those coming from a search engine, and that your site is helpful, whether they’re using a desktop computer or mobile phone. In most cases, the SEO will need four months to a year to help your business first implement improvements and then see potential benefit.

My strongest advice when working with an SEO is to request that they corroborate their recommendations with a documented statement from Google, either in a Help Center article or a response from Google in a forum, that supports both: one, the SEO’s description of the issue that needs to be improved to help with ranking, and two, the approach they prescribe to accomplish this task. Requesting these two bits of information will help prevent hiring a poor SEO who might otherwise convince you to do useless things like adding more words to the keywords meta tag or buying links, because if you search for Google’s advice on these topics, you’d see blog posts and articles from us that clearly explain that adding keywords to the meta tag wouldn’t help.

Furthermore, while Google uses links for PageRank, our documentation highlights that we strongly advise against buying links for the purpose of increasing PageRank. One basic rule is that, in the majority of cases, doing what’s good for SEO is also doing what’s good for your online customers: things like having a mobile-friendly website, good navigation, and building a great brand. Additionally, if you’re a more established brand with complicated legacy systems, then good search-friendly best practices likely involve paying off some of your site’s technical debt, such as updating your infrastructure so that your website is agile and able to implement features faster in the long term.

If you own a small local business, you can probably do the initial work yourself; check out our 30-minute article series on how to build an online presence for your local business. Now, if you still believe you want to hire an SEO, here’s a general process: one, conduct a two-way interview with your potential SEO and check that they seem genuinely interested in you and your business; two, check their references; three, ask for (and you’ll probably have to pay for) a technical and search audit; four, decide if you want to hire.

Let’s break this down and start with step one: conduct a two-way interview. In the interview, here are some things to look for. A good SEO doesn’t focus only on search engine ranking but on how they can help your business, so they should ask questions like: one, what makes your business, content, and/or service unique, and therefore valuable to customers?

They want to know this information to make sure it’s highlighted on your website for your current and potential new audience. Two, what does your common customer look like, and how do they currently find your website? Three, how does your business make money, and how can search help? Four, what other channels are you using: offline advertising, social networks? Five, who are your competitors, and what do they do well online and potentially offline? If the SEO doesn’t seem interested in learning about your business from a holistic standpoint, look elsewhere. It’s difficult to do good SEO without knowing about a business’s goals, its customers, and its other existing marketing efforts.

SEO should complement your existing work. The second step in hiring an SEO is to check references. If your potential SEO provides prior clients, be sure to check those references. You want to hear from past clients that the SEO was able to provide useful guidance and worked effectively with their developers, designers, UX researchers, and marketers. A good SEO should feel like someone you can work with, learn from, and experiment with, and who genuinely cares about you and your business, not just about getting your site the highest rank, as ultimately those techniques rarely last long, if they work at all. They’ll want to educate you and your staff on how search engines work, so that SEO becomes part of your general business operations.

Step three is to request a technical and search audit. If you trust your SEO candidate, give them restricted (view, not full or write) access to your Google Search Console data and even your analytics data. Before they actually modify anything on your website, have them conduct a technical and search audit to give you a prioritized list of what they think should be improved for SEO. If you’re a larger business, you can hire multiple SEOs to run audits and prioritize improvements; see what each has to say and then determine who you could work with best. In the audit, the SEO should prioritize improvements with a structure like: one, the issue; two, the suggested improvement; three, an estimate of the overall investment, in other words the time, energy, or money it would take for your developers to implement the improvement, and for Google Search, as well as searchers and customers, to recognize the improvement (the SEO will need to talk with your developers to better understand what technical constraints may exist); four, the estimated positive business impact. The impact might be a ranking improvement that will lead to more visitors and conversions, or perhaps the positive impact comes from a back-end change that cleans up your site and helps your brand be more agile in the future.

Five, a plan for how to iterate and improve on the implementation, or perhaps how to experiment and fail fast should the results not meet expectations. That covers the structure of the technical and search audit. Now let’s talk about each of these audits individually. In the technical audit, your SEO should be able to review your site for issues related to internal linking, crawlability, URL parameters, server connectivity, and response codes, to name some; a small response-code spot check is sketched below.
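As a concrete but purely illustrative example of that one slice of a technical audit, here is a minimal sketch that checks server connectivity and response codes for a few pages. It assumes Node 18 or later for the built-in fetch, and the URLs are placeholders rather than pages from any real audit.

```typescript
// Spot-check response codes for a handful of URLs (placeholder URLs below).
const urlsToCheck = [
  "https://www.example.com/",
  "https://www.example.com/old-page",
  "https://www.example.com/products",
];

async function auditResponseCodes(urls: string[]): Promise<void> {
  for (const url of urls) {
    try {
      // redirect: "manual" keeps 301/302 responses visible instead of following them
      const res = await fetch(url, { method: "HEAD", redirect: "manual" });
      const location = res.headers.get("location");
      console.log(`${res.status} ${url}${location ? ` -> ${location}` : ""}`);
    } catch (err) {
      // A thrown error here usually points to a connectivity or DNS problem.
      console.log(`FAILED ${url}: ${(err as Error).message}`);
    }
  }
}

auditResponseCodes(urlsToCheck);
```

A real audit would also cover internal links, URL parameters, and duplicate content, but even this kind of quick check can surface broken pages and unexpected redirects.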

If they mention that your site has duplicate content problems that need to be corrected, make sure they show you the specific URLs that are competing for the same query, or that they explain it should be cleaned up for long-term site health, not initial growth. I mention this because lots of duplicate content exists on websites, and often it’s not a pressing problem. In the search audit, your potential SEO will likely break down your search queries into categories like branded and unbranded terms. Branded terms are those with your business or website’s name; for example, a search for Gmail is a branded term, while a search for email is an unbranded or general keyword.

An SEO should make sure that, for branded queries such as Gmail, your website is providing a great experience that allows customers who know your brand or website to easily find exactly what they need and potentially convert. They might recommend improvements that help the entire searcher experience, from what the searcher sees in search results to when they click on a result and use your website.

For unbranded queries, an SEO can help you better make sense of the online competitive landscape. They can tell you things like: here are the types of queries it would make sense for your business to rank for, here’s what your competition has done, and here’s why I think they rank where they do. For instance, perhaps your competition has great reviews, really shareable content, or a highly reputable site. An SEO will provide recommendations for how to improve rankings for these queries and the entire searcher experience.

They’ll introduce ideas like: update obsolete content. They might say your site is suffering because some of your well-ranking content is obsolete, has poor navigation or a useless page title, or isn’t mobile-friendly; let’s improve these pages and see if more website visitors convert and purchase, or if they can micro-convert, meaning that perhaps they subscribe or share content. Improve internal linking: your SEO might say your site is suffering because some of your best articles are too far from the homepage, and users would have a hard time finding them.

We can better internally link to your content to feature it more prominently. Generate buzz: the SEO might say you have great content, but not enough people know about it; we can try to get more user interaction and generate buzz, perhaps through social media or business relationships. This will help us attract more potential customers and perhaps garner natural links to your site. Learn from the competition: your SEO might explain, here’s what your competitors do well.

Can you reach parity with this and potentially surpass them in utility, or can you better show customers your business’s unique value? Again, a good SEO will try to prioritize which ideas can bring your business the most improvement for the least investment, and which improvements may take more time but help growth in the long term. Once they talk with you and other members of your team, such as developers or marketers, they’ll help your business forge a path ahead.

The last thing I want to mention is that when I talk with SEOs, one of the biggest holdups to improving a website isn’t their recommendations; it’s the business making time to implement their ideas. If you’re not ready to commit to making SEO improvements, then while getting an SEO audit may be helpful, make sure that your entire organization is on board; otherwise your SEO improvements may be non-existent, regardless of who you hire. So that wraps it up. Thanks for reading, and best of luck to you and your business.



Categories
Business Tips Online Marketing Search Engines

Does Your Digital Marketing Research Pass The Test? 1 Thing You Can Improve On Today!

Digital Marketing Research is an ever-changing game that online marketers are constantly attempting to win and scale. Following the latest trends, trying to figure out the newest apps and social platforms, and keeping up with their development is a full-time gig. In the last decade, digital marketing has become a necessary evil for businesses to explore, create, and execute. We have designed, created, and deployed campaigns, and seen both successful ones and ones that have fallen flat. When starting out, we tripped over a ton of mistakes. That is natural, and we have always gotten back up to hack at it again. The one thing we have gained from all of our work is one fact:

Digital Marketing Research is a must!

The first step is to know your audience!

Where do you go to do that? There are many paid services that will give you a ton of data on where to put your digital content, ads, and promotional information. However, out of the many tools we use, there is one that works the best when starting out your online marketing campaign. The most useful and important tool is the Google Ads Keyword tool! It is free and easy to use once you get the hang of it.

Google’s Keyword tool

This is a free service from Google that you get when you open a Google Ads account. The tool has some powerful features and will make a difference in your efforts to spread your brand around the virtual world. If you are not sure how to use the Google keyword tool, check out the video below.

Thank you for visiting, and I hope you have a wonderful day! This post has been sponsored by Farmers agent Matt McPherson. Here he is letting you know about car insurance in Arizona.

Categories
Business Tips Online Marketing Search Engines

You Should Really Advertise your Content!

When designing a website and beginning your online marketing journey, content is king. Your content might better describe the services you provide, it might provide answers to specific questions that customers have, it might even have advice and tips on matters related to your services or your target market. Writing great content is as important to your site as good design, if not more important. But if you’re not advertising that content, you could be missing out on its full potential.

Improve awareness

Your content is one of the most valuable aspects of your website. If it’s written well, it’s designed to draw conversions and referrals. It’s marketing material, so treat it as such. Make sure that it’s visible. Content that isn’t visible won’t be read and if it’s not read, it’s a waste of your time and efforts. Advertising your content improves awareness of it across the board. It becomes significantly more visible, drawing in new readers and unique site users. From there, those new site users can go on to continue exploring your site and could become brand new customers. Content is one of the most cost-effective ways to turn traffic into customers, but only if people can actually see it.

Search engines will love you

The vast majority of internet users use search engines like Google on a regular basis. If they have a problem they don’t know how to fix, they will look to Google. If they want to find out who provides a specific service in their area, they will Google it. If they want to stay updated on a certain issue, they will search for it. Advertising your content well improves your search rankings, meaning that the content is more likely to appear at the top of searches relevant to what it offers the reader. But it also improves the search ranking of your website as a whole, not just that one page. That means any further content or web pages you create will benefit from the boost provided by content marketing.

Build a stronger brand

If your content is written well and provides real value to readers, whether describing a service or providing insight, information, and advice, it also has real value to the business. It does more than fill in the gaps of your website. It can be used to establish a better online presence. The more fresh, quality content you have out there, the more your business grows as an online brand and a place for expert information. It builds legitimacy and prestige through quality information, which means that people are much more likely to trust the content, trust the website, and thus trust the brand. With the sheer number of dubious sites and businesses on the internet, building that trust and standing out as “one of the good ones” is crucial.

Go social

The internet is becoming a more and more interconnected experience. Now, we don’t just consume what appeals to us, we share it with the world as well. A social media presence is almost essential for a business hoping to grow online. But that social media presence isn’t just about having accounts and talking to the community. By advertising your content on social media, you make it much easier for others to share. Share-worthy content can get you significantly more referrals and conversions than direct marketing. Take advantage of the online community’s habit of sharing what they love. Craft content they will love, market it, and watch them continue to market it for you.

Gathering data

The information you gain from analytics covers not only those who click those advertisements but also those who go on to read your content. It helps you further understand who you’re writing to, what they want, and what content strategies work best for you. Content marketing is also one of the primary ways that businesses build leads to later convert. In fact, companies that blog or produce content regularly generate 62% more leads on average than those that don’t. Create “lead bait” content, content that’s designed to grab the reader’s interest and urge them to learn more about you. The more they learn about the business, the more likely they are to become partners. You can then go on to further nurture those leads with impressive calls-to-action or email marketing signups.

The benefits of advertising your content, not just your services, are clear. Content targets the market’s wants directly, improves your brand’s reputation, and can win you new customers who weren’t even looking for you. Let Allshouse Designs take your content to the next level and fulfill its true potential by advertising it directly to your market.

Allshouse Designs can help market and advertise your website content. Learn how by looking at our Digital Marketing and Advertising service. As always your opinion matters to us. Feel free to post your comments below or connect up with us on Facebook.

Categories
Business Tips Online Marketing Web Development

What is Bounce Rate in Google Analytics

Based on our research, we found the definition of “Bounce Rate” to be the percentage of visitors to a particular website who navigate away from the site after viewing only one page.

Now why is bounce rate so important?

In my eyes, I see bounce rate as an indicator of how well your website is put together and the quality of the users that are visiting your website. Basically, if your average bounce rate is 85%, this means that 85% of visitors to your site are not going any further than the first page they were directed to.

So what does that mean for you?

Well, there are two aspects that you should be looking at when you have a website that produces an 85% bounce rate. First, is this the rate you are looking for? For example, maybe you have a landing page and that is the only page you want your users to visit, perhaps a page with a subscription button, a buy now button, or a web form to fill out. In that case you would want a high bounce rate, because you are not looking for your users to go anywhere else on your website. However, if it is too high, and this is where the second aspect comes in, a really high bounce rate could indicate that you are not attracting the right users, that your landing page loads slowly, or that your landing page is not designed correctly to take full advantage of the user’s visit.
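To make that arithmetic concrete, here is a tiny sketch of the calculation; the session counts are made-up numbers, not real analytics data.

```typescript
// Illustrative only: bounce rate = single-page sessions / total sessions * 100.
function bounceRate(singlePageSessions: number, totalSessions: number): number {
  if (totalSessions === 0) return 0;
  return (singlePageSessions / totalSessions) * 100;
}

// e.g. 850 out of 1,000 sessions viewed only one page -> an 85% bounce rate
console.log(`${bounceRate(850, 1000).toFixed(1)}%`); // prints "85.0%"
```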

In a nutshell, bounce rate is a good metric to pay attention to. We shed more light on bounce rate in our latest YouTube video, “Tools & Tips – Episode 4 | What is Bounce Rate?”

As always, we are growing and learning every day. This is a great post for people to add their opinions about our video or bounce rate in the comment section below. We really enjoy this kind of stuff, so your comments are greatly appreciated.

Categories
Business Tips Online Marketing Search Engines Updates for Allshouse Designs

What’s New In the world of Google My Business

[Image: a man yelling through a megaphone about Google My Business]

So what is new with Google My Business? Well, if you haven’t seen or read up on the new functions, keep on reading, because I am going to tell you about them.

Posting function

The number one new function is that you can now post messages and links to your profile! WHAT!!! This is a true game changer for locally listed businesses. Personally, I think Google is tired of local businesses getting wiped out in search rankings by national companies. Makes sense to me. We have already started to use the function on our own Google My Business listing. To see it, just Google Allshouse Designs. You may be reading this because you clicked on the post that I created on our Google My Business listing.

Chat Function

Holy cow! You can now chat with your potential or existing clients using the texting app on your mobile device. WHAT! That’s right! A client can now chat with your business through the listing.

Webpage function

Yes, you can build a webpage for your business. It lives on Google’s servers and is instantly connected to your Google listing, or you can choose to keep your own website connected. You can check ours out here.

Let Allshouse Designs take care of your Google My Business listing. We will fix it up, make sure everything is being used properly, and create one post on your listing to get you started. Unfortunately, this does not include the virtual tour function, but everything else will be done for a one-time payment of $199 USD. Just fill out our request-a-callback form to get started.

I wrote up a few good reasons why you should have a Google My Business listing; please check it out. If you have any questions or concerns, please post your comments in the comment section below.