
Site Speed for Digital Marketers

Speakers

Mat Clayton , Mixcloud

Specialties: Python, Django, MySQL, PIC Controllers, Embedded C/C++...

What you will learn

  • Know what factors affect site speed
  • Pinpoint specific elements that could be dragging down your speed
Video Transcript
Site Speed for Digital Marketers

[00:00:28]
Mat Clayton:
Thanks, Rob. Yeah, so I'm here today to talk to you about site speed and I'm going to, as Rob said I'm actually an engineer at heart. So what I'm going to try and do is give you guys the skill set as marketers to talk to people like me as engineers on where we should go looking to speed up the site. So this is something I do day in day out, try and speed up websites, specifically one site. And what I'm going to try and do is point you in the right directions so you can point me in the right direction or the people like me on your team.
So Rob covered this really quickly, but why should you care about site speed? It seems obvious, but there are several reasons, and I'm going to reiterate them for you.
So the first one is, Google said this matters for rankings. They said it back in 2010 and then they reaffirmed it in January 2018, so it matters for mobile rankings as well. This will come as no surprise to anyone. Speed matters. It matters to Google, it matters to us, it matters to you guys, and it also matters for conversion.
Try clicking again.
So every time anybody runs a survey or they run a test to do with conversion, the faster your website goes, the more people convert. It just happens time and time again. Every single time. Every survey. I've never seen one which says my site ran slower, I looked at the page for longer; therefore, I decided to buy something. I don't think it's ever happened. So we know it helps conversion.
We also see that it actually helps indexing. So there are two things: there's page speed and there's site speed. I will go into the difference between those later on in this talk. This is regrettably not our data, because I actually had a graph like this ready to go for you guys and then I lost my laptop. So I found somebody else's, who found the same properties.
And what we see here, and hopefully you can see it over there as well, is that the faster you make your website, the more time Google will have. Well, they will have the same amount of time, but the more pages they will get, because each page is downloading faster. So they seem to allocate a certain amount of resources to you, and those resources are essentially measured in CPU or time, and the faster your pages are, the more you'll get.
We actually see similar properties as well when you start looking at where you put your JavaScript and your CSS, although regrettably I don't have those with me today. Cool.
So it's just a better experience. Which site would you prefer to have, the one which loads fast or the one which loads slow?
So here we've got the same page and we've actually simulated this. This is one of our properties. The right-hand one is simulated using a 4G connection, the left-hand one is simulated using a 3G connection. It's pretty clear which one the user would prefer to use.
So there are multiple ways we can do this. We can speed it up by moving assets around and making them smaller and faster, and I'm going to try and help you today by guiding you to those points, so you can see where your site is slow and how to speed it up.

[00:03:12]
What are Page Speed and Site Speed?
So let's start with, sorry the clicker’s being…
What are page speed and site speed? What is the difference between the two of them? Why do they matter? And why are there two definitions here?
Let's start with page speed. So page speed is very simple: it's when the trucks, as Tom very elegantly said earlier, go to your server and then come back again. It's when that first truck comes back, the first bit of the truck, actually. It's the first byte. So it's the bumper crossing the line, in Tom-speak.
So we measure this in what's called TTFB, which is time to first byte. That's the first byte coming back from the server on the first request, typically HTML, so it's usually the start of an HTML tag. And this matters because it's kind of the fundamental measurement of when the document arrives. But actually, when you look at this, it's only 20% of the problem. Once you've got that document to the browser, there's 80% of the time left. That's the time downloading the CSS, the JavaScript, the fonts, getting them all up and running; that's parsing, executing and beyond, and we'll get to this in a bit. But don't forget about this. And this is what we call page speed.
Richard Baxter put this pretty well the other day, and I follow Richard on Twitter. He sums it up as this: a lot of SEOs don't really realize how much expensive DB queries can affect page load time. You're running heaps of queries to build a page, fetching navigation, filters, attributes on a product page. The most effective thing you could do is start there.
Essentially the takeaway from this is don't forget about how slow your servers are. Try and find ways of removing content from the page to speed them up. It's a simple lesson, but it's actually not the core of this talk. I wanted to put it out there because I've realized that the more marketers I speak to, the more they just want to put more stuff on the page to convert, and actually you might be slowing things down. You might be making your conversions worse. So don't forget about making the server faster.
What is site speed? This is how we think about it, how my team thinks about it. When you load a webpage, it goes through various stages.
Let me start at the left. You initially have first paint, and this allows the user to realize something is going on. You've all seen this: every time you load a web page, something flickers. After that you have first contentful paint; that's when some bit of content has arrived.
You move on a bit further and then you have first meaningful paint. We think of this as the first bit of content we've got to the user which actually has real meaning and helps deliver towards their goal. Remember, these are users; these are people who actually have an objective. So in this case, this is Google, searching for Google's CEO. We've actually got the answer there. It isn't a fully rendered page at this point, but it is enough that the user has got value from it.
The final one is what we call TTI, time to interactive. This means the page is fully loaded. You can click it, the images are in there, you can scroll. It's fully interactive.
And when we think about site speed, what we're actually thinking about, in our terms, and what most people think about, is these final two steps. How do we get content on the page which is meaningful, and how do we make it interactive? And how do we make those two things happen way, way faster?
So that's what I'm going to focus on today: not the server side and how you impact that first byte. Because this is where I think you as marketers can look at the source of a page and start making little tweaks which will make a massive difference to that 80% of the load time which is spent here, not the 20% from the server.
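As a rough illustration of how an engineer could capture the first two of these metrics in the browser, here is a minimal sketch using the PerformanceObserver API; first meaningful paint and time to interactive aren't exposed directly by the browser and are estimated by tools like Lighthouse instead.

```js
// Minimal sketch: log first paint and first contentful paint as the browser reports them.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    // entry.name is 'first-paint' or 'first-contentful-paint'
    console.log(entry.name, Math.round(entry.startTime), 'ms');
  }
}).observe({ type: 'paint', buffered: true });
```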

What makes up a website?
So what makes up a website? Hopefully you're all aware of this. It's JavaScript, it's CSS, it's HTML, all that good stuff. But let's actually look at how that loads and why it loads the way it does.
So Tom mentioned this earlier. Webpagetest.org is a fantastic service. I highly recommend you go and put all of your domains through this as soon as possible. And what you can do is plug in your domain or your competitors’ domains. We’ll see it in a bit. I'll run through some not exactly competitive areas, but some other services which are interesting to take a look at.
So you put it in there, you click start test, and one of the first reports you'll get back is this. It will show you what your site is made up of. In our case, this is one of my main properties. You'll notice that we have images, JavaScript, HTML (although in our case that's API calls, so JSON and XML going backwards and forwards), CSS, fonts, and no Flash or video in this case. What's interesting is how dominated this is by images. In our case, images are actually 50% of the site, and that's low. Most people will be up at 70% or 80%.
We're low on this because we actually have a lot of JavaScript on the site we load in. It's highly optimized; we spent a lot of time trying to optimize it, but it's still a significant chunk. For most people, when you run this on your site, you will see the image share even higher. The takeaway from that, and we'll come back to it later, is that images really, really matter. You can make a big difference if you just change how your images work, and it's actually really basic to do that stuff.
So how does a website load? You've seen these before. Again, we'll go down the truck analogy route, so I'm adjusting this a little on the fly here. We start the road up on the left-hand side. We establish the connection: that's your DNS lookup, then the connection itself, then SSL because it's secure. At this point we download the HTML, so that's just the HTML page coming down. Then the CSS is the next line down, and then the JavaScript the next line down from that. And then a couple of GIFs in this case, two or three GIFs, and an icon as well.
So those are your images. The interesting bits on this graph, though, are actually these vertical lines over here. So we download the HTML, then here you can see the start render, which is when the first paint happened, and here you can see domInteractive. These are essentially the two lines you want to shift as far left as possible, to shrink down, to make your site faster.
This is actually a really simple example; this is the site for Hacker News, and in this case it has about six assets. When you look at your site, I guarantee it will look more like this. This is one of our properties again, mixcloud.com, the main homepage: thousands of images coming in, hundreds of waveforms, assets loaded all over the place. But what you care about is these two lines again: when was it rendered, and when did it become interactive? Anything after the interactive point is just asynchronously loading stuff in for later on; it's not really relevant to the first user experience, we're just trying to get a head start for later on in the game when they're actually using the site. You want to basically take those lines and move them left. So in this case, the site was visible, you could visibly see it, in 1.7 seconds, and it was fully interactive in about 4.2 seconds. This is not atypical, it's pretty normal.
So once you've run these tests, what I highly recommend you do is go back to your test history. This is interesting for a few reasons. What you can do here is start comparing multiple websites and you can do multiple different scenarios. You can simulate them under 3G. 4G. You can run it from Dallas. You can run it from London. You can introduce packet loss which is the idea that the connection becomes really flaky. Potholes is what Tom would call it.
So what we did here is I ran the 4G and 3G comparison; that was the second slide I showed you, with the videos of the two. This is really useful, particularly when you put it in slow-mo mode, because we can start seeing stuff coming onto the page. You get an idea of what's coming in when and what you want to move above the fold or below the fold, because if stuff is loading down towards the bottom, it's maybe not quite so important.

[00:10:12]
How to Make a Site Load Faster
So I've told you what a site consists of; let's think about how we can make it load faster.
So where do we start? As engineers, we always start by measuring things, similar to marketers. So what's the best way to measure this? There are four or five different methodologies, and some of them are more accurate than others. I'll talk you through some of the quick ones to get going with.
Point your engineers at this. This is the Navigation Timing API in JavaScript, which will give you all the information you need, from when a click has happened all the way through: this event has happened, this event has happened, this event has happened. In this case, what you'll see is that you can measure from a user clicking; we looked at the DNS, we can measure the time of the response, and all the different points, time to interactive, time to first byte; all that data is there. What you need to do is just point your engineers at this and say, please can you grab this for me, and please pull out the various points we're measuring. The ones I would recommend you look at are requestStart and responseStart, which give you your time to first byte, and domLoading and domInteractive, the other two interesting ones down there.
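As a minimal sketch of what that request to your engineers might produce, the snippet below reads those fields from the Navigation Timing API in a modern browser; the exact properties available differ slightly between the older performance.timing object and the newer navigation entries used here.

```js
// Pull the navigation entry for the current page and log the key milestones.
const [nav] = performance.getEntriesByType('navigation');

console.log('requestStart:', Math.round(nav.requestStart), 'ms');
console.log('TTFB (responseStart):', Math.round(nav.responseStart), 'ms');
console.log('domInteractive:', Math.round(nav.domInteractive), 'ms');
console.log('fully loaded (loadEventEnd):', Math.round(nav.loadEventEnd), 'ms');
```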
Google Analytics plots this out for you, and it's also available under Site Speed. I'm sure most of you have looked at this. We often have problems with it, because what we find is that historically it counts all the stuff we load in at the end of the page, which isn't really relevant when the site is already fully interactive by that time. So I'd caution against paying too much attention to this.
Pingdom is another great service, which will deliver this as an email report to you. In our case we get it once a week, and you can see that massive site problem about two weeks ago where people were getting page load times of 50 seconds, pagers went off, and people were not very happy. Again, it's got the same problem, about 5.2 seconds in this case, but these are really good for getting a first baseline measure. What you'll find is that if you're not loading too many assets after the DOM is fully ready or the page is fully rendered, these won't be too inaccurate, but you'll have to try them for yourself.

[00:12:04]
Where to Start
So we know roughly how to measure it. It's a bit of a dark art, but torture your engineers about how to use that JavaScript Navigation Timing API. So where do I start? Where do I optimize to begin with?
Tools
And thankfully there have actually been some great tools that have come out for this. This is probably going to be the legacy of Yahoo; it's the best thing they ever did, for me anyway.
There's a plug-in called YSlow. How many people have tried this already? A bunch of you, cool. So YSlow: you download it, it's available for pretty much every browser, I'd just recommend the Chrome plug-in, and when you run it on a website it will give you a bunch of data; it will give you an analysis, essentially.
So we'll go through some of these in a bit. But in this case, it's telling you, you need to turn caching on.
There are more of these. Google made PageSpeed, and that's available as a website, you can go and look it up, or you can use the Chrome extension.
What I would generally advise is try all of them and then have a look at their similarities. Have a look at all the different issues they raise and the optimizations they suggest, and just start with the ones at the top of each one. You'll see they suggest 90% the same things; sometimes they suggest them in slightly more subtle ways.
There's a third one called Lighthouse, again by Google. This used to be a site, but actually I highly recommend you use the built-in audit tool in Chrome. So if you right-click, Inspect, go to Audits and then run an audit on the website, it will give you some fantastic data. And it will give it to you live from your browser rather than running it from a server, which is interesting because it takes into account where you are right now.
There are a few issues with Lighthouse which I'm going to flag at this point. The first one is it will always run in mobile view. So if you want to analyze a desktop website, it becomes a little tricky. Sometimes that's good, but sometimes, like when you have a mobile website and a desktop website and you're not running responsive, you need to run an analysis on both, and it's very tricky when it insists on going into the mobile website every time.
But it does deliver this. So you remember the first paint I told you about, and the first contentful paint: it will give you the renderings of these every time, so you can actually get a good idea of how your browser ran. It slows it down so you can actually see it. In this case I ran this on an aeroplane, so the ping time to Google was about 14 seconds. That's not normal, but it does highlight the fact that you can see this. It'll also give you a list of opportunities, places to go looking for things to tune.
These don't always make total sense in Lighthouse, but it's worth having a quick look at them. Off-screen images, for example: in this case, it's suggesting that these images are off the page, so why did you bother loading them? You don't need them yet; the user doesn't need to see them, they can't scroll yet. So you'll get a list of these opportunities, and you want to go down them one at a time, probably taking them to your engineers and asking, can we fix this please?
But again, these will just give you hints on where to start. I'm going to give you some more general tooling now, so general areas to go and look at, and try to explain some of the most common things these tools will come back with.
What these tools show
So the first one is connections and requests or trucks as we now call them. So before you can open a connection, you need to know where your servers are. The stage really does creak a lot here.
So we use what's called DNS for this. You all run DNS; it's just the way the internet works, it's the domain name system. DNS essentially comes in two categories: anycast, and then standard DNS.
Standard DNS is where the DNS servers are placed in a single location in the world, which is great for people who are nearby, because the speed of light means they'll respond really quickly. However, the DNS server can be miles away from you; for example, ours are in London, and that's about 160 milliseconds from here. That's just the speed of light; there's nothing you can do about physics in this case. But what you can do is cheat.
You can buy from an anycast provider. Cloudflare is one of these, AWS Route 53 is one of these, and so is easyDNS. What they effectively do is take your DNS entries and, rather than having all of them in one location, store them in hundreds or almost a thousand locations around the world, so the user just goes to their nearest one. In this case, when I checked it out from San Diego last night, it's about 20 milliseconds away instead of 160. That's about 140 milliseconds saved. These things start to add up, and this is a very cheap quick win; most of the time it will be free, you just need to go and buy from a provider which supports it.
As I say, there are three of them.
The next one is HTTP/2. Again, this has been covered in great depth by Tom, who did a fantastic job of it, so I'm just going to give you a real-life example here.
So this is a recording of a website where, on the left-hand side, what you'll see is an image load which has been sliced up into lots of small images. It's going to load about 200 little images to build the image up, on the left using HTTP/1.1 and on the right using HTTP/2, and we're going to watch and time the difference between them.
So let's have a look at what the difference is. HTTP/1.1: 1.5 seconds. HTTP/2: 1.3 seconds. This is actually real time, recorded from my hotel a couple of days ago, and the results are astonishing; that's a five times improvement for just a simple upgrade. It's worth it: find a way to get your engineers to move to HTTP/2. You won't always see a five times difference, this is somewhat of a contrived example, but you will see a multiple-times difference, a multiple-X increase.
And why does it work? I'm going to try and speed through this, because hopefully you've covered it a bit already.
You'll see these six requests open up, or six connections; this is normal, most browsers will run six connections. The requests go out, they get responded to, the next six go out, they get responded to. Again, it's the truck analogy. With HTTP/2, they all go in at once: we ask for all the images in a single go and they come back almost like a shotgun, the whole lot comes back in one.
Just to give you a bit more depth on why this works: HTTP/1 is what I keep referring to, and it's what most of the world uses, but it's actually 1.1; 1.1 came out about two years after 1.0 did.
What happens is, once you've opened the connection, you can ask for a request and it gets responded to, then you ask for another one and that gets responded to. So these are your trucks: your trucks go out and then come back again. Then they added pipelining, that's what arrived in 1.1, and the catch there is that you can actually send multiple trucks in one go, but the returning trucks must come back in the same order. So imagine you send a bunch of trucks over and actually one of the returning ones is a Ferrari, but it has to wait behind the other trucks.
So this is what HTTP/2 solves. It means that your responses can come back out of order, so you don't get blocked by the slow truck in front of you; you're allowed to go past. It means your fast requests come back and they're not queued behind the slow ones, which means you've essentially got no blocking anymore. It's called head-of-line blocking: you don't have to wait in the queue for the person in front of you. If your orders get in quick, they can just bypass the others. And that's why HTTP/2 is fundamentally so much quicker.
So I just, clicker doesn't always work. There we go.
You can check this, as Tom said: in the inspector, turn on the Protocol column and trigger a bunch of traffic. You'll see either HTTP/1.1, or it shows HTTP/2, or h2 as it's called. Sometimes you'll see something called QUIC; this is actually the next version after HTTP/2, which is already being trialled by Google in Chrome. You can't really use it yet in your own services, but it's interesting when you see it nonetheless. I think Google Analytics runs on it, and Gmail, and I think Google search does as well, don't quote me right now. It should give even further improvements; we'd expect that maybe a few years from now.
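If you'd rather check from the console than from the Network panel, here's a rough sketch using the Resource Timing API; most modern browsers expose the protocol each asset was served over as nextHopProtocol.

```js
// Log the protocol ('http/1.1', 'h2', or 'h3'/QUIC) used for each asset on the page.
for (const entry of performance.getEntriesByType('resource')) {
  console.log(entry.nextHopProtocol || 'unknown', entry.name);
}
```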
What other benefits do you get? Header compression, which is basically a freebie: your headers, which go backwards and forwards, get compressed, and you don't have to resend anything which was already sent, you just say it's the same as last time. You also get prioritization, so the browser can say: that CSS, I really need that before those images. So you can actually say what order you want stuff to come back in. And there's server push, which allows the server to send content to the client before it's asked for. My advice would be: do not use this. It's not that it's not ready; it's that it's very, very hard to get correct, and if you get it wrong, you'll slow your site down. You're unlikely to get it correct. It's very, very hard.
Again, you can do other things, and we had all these old techniques, things like sharding, domain sharding. This is the idea that, because you were limited in how much content you could get from a single connection simultaneously, you'd just open connections to two or three different domains and get it all at the same time.
We also had sprites, which was the idea Tom mentioned where you put all the images into one file and move it around. Both of these are basically a disaster under HTTP/2. They will slow you down rather than speeding you up; they'll actually get in your way. So I can't recommend doing this, and I can't recommend doing that.
What I would recommend you do is just design the web normally. Don't try and optimize with these kinds of tricks anymore. So any of the advice you see online saying to do this stuff, start ripping it out; it will just be slowing your site down.
Cool. So that's the connections. Hopefully, having heard it from both myself and Tom, it will now sink in and you'll go and try to get HTTP/2.
Static Assets
Let's talk about your assets.
So there are certain parts of your website which just never change. You'll know what they are: for the most part it's the JavaScript, the CSS and the images. For some of you lucky people in here, it'll be the HTML as well; for most of us, we have to change that on the fly.
So what can we do about these? How can we make these fly? How can we make these go at a pace where they just appear in an instant?
CSS and JavaScript are kind of the hardest ones to get right, so I'm going to talk to you about these first and then we'll get into images in a bit.
So I know most of you guys aren't developers, though maybe a few of you are. But some of the technologies developers will talk about these days are bundling tools: rollup.js is one of them, webpack's another, Parcel's another. You don't really need to know how these work, but what you do need to realize is what they're trying to achieve and why. You need to be able to identify when your developers aren't using tools like this and try and point them in that direction.
So what do we want to achieve from tools like this? What we want to do is take the code we generate, the CSS or JavaScript, and optimize it. We want to get it to a place where, when the client gets it, it's crammed down and ready to go. So let's have a look at it.
Again, there we go.
So, JavaScript and CSS. I'm going to give you the example of JavaScript here, but this applies equally to both. We write JavaScript like it is on the left-hand side; I'm sure you've all seen stuff like this before. But this is not optimal to send to a browser, and the reason is that there's lots of wasted space in here. When we have wasted space, we're sending a bigger file down with more bytes; it takes longer to get there and longer to download, and this is emphasized on mobile. Then, on top of that, it's a bigger file for the browser to hold in RAM, to hold in memory and deal with, so it'll just run slower.
So what we want to do with that is minify it, and this is the idea: we take all the white space and get rid of it. We make it so a human can't read it, but a machine is absolutely fine reading it. And the second thing we want to do is take every variable, the little names we have around here, the keywords, and replace them with something like the letter a, because a long name takes a lot more bytes to send than the letter a.
So the tools out there, the ones I just showed you, will do this for you. They'll take your beautiful JavaScript and mangle it to the point where it's not really useful for a human, but for a machine that's perfectly fine, and we've saved all those bytes in transit.
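As a hypothetical illustration of what that mangling looks like, here is a small, made-up function before and after the kind of transformation a minifier applies.

```js
// Readable source, as a developer would write it:
function calculateTotalPrice(items, taxRate) {
  let total = 0;
  for (const item of items) {
    total += item.price * item.quantity;
  }
  return total * (1 + taxRate);
}

// Roughly what a minifier emits: same logic, whitespace stripped, names shortened.
// function c(t,r){let n=0;for(const o of t)n+=o.price*o.quantity;return n*(1+r)}
```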
When you view source on a website, or you inspect it and look at the JavaScript, the browser will sometimes show you the original version back; it'll say it's using a source map. That's an indication that your developers have actually crammed it down the right way.
So they've minified and concatenated it all properly, but they've given you the tool chain necessary to unpack it again, so you as a human can see it properly. What you want to see, ideally, is the right-hand one, or a source map coming up.
Finally, you might hear people mention tree shaking. Again, this is tooling we use as developers, and it's a fantastic idea. Regrettably there's no really good image for this, but the concept is simple. When I write JavaScript, I actually write a bunch of JavaScript which is no longer relevant; I just forget to remove it, I leave it in there. So why do we need to send that to the client? We don't, is the simple answer. Tree shaking is this concept of, when we build our JavaScript ready to put on the website live, we look at it, analyze it, and remove all the dead code. And we do it automatically, so we say: this function, this bit of code, is never called, ever, so just get rid of it.
And when you use modern tool chains, modern bits of JavaScript, there's a tendency to just import the world and dump it all in there, but when you're only using 10% of it, the other 90% is a complete waste of resources. So tree shaking is something you might hear them mention, and I'd highly recommend you do a little Googling or talk to them about it. If they're not using it yet, it's a great way to crush that stuff down.
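For a rough sense of what tree shaking relies on, the sketch below contrasts importing an entire utility library with a named import that a bundler can prune; the library choice and handler are just illustrative.

```js
// Pulls in the whole library even if only one function is used:
// import _ from 'lodash';

// A named import from a module build lets a tree-shaking bundler
// drop the unused 90% of the library at build time.
import debounce from 'lodash-es/debounce';

const onResize = debounce(() => console.log('resized'), 250);
window.addEventListener('resize', onResize);
```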
Finally, and this one is actually good: what you'll see is that most modern websites are splitting up their JavaScript and CSS. We don't do this right now because it's actually really, really hard to do correctly. But once you've got your JavaScript and your CSS and you've crushed it all up, what you don't want to do is send it down as one single file, because you don't need all of that CSS for every page; some of it goes unused.
So what people will do is split them up into lots of little files. When you combine this with something like HTTP/2, you've essentially got that shotgun approach: you're sending all the files in one go, and you'll transmit your JavaScript to the client significantly faster than you would have done before.
So what you want to see is your JavaScript and CSS coming in as lots of little files, not one big one.
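A minimal sketch of one common way this splitting is done, using a dynamic import() that bundlers like webpack or Rollup turn into a separate small file; the module path and element id here are hypothetical.

```js
// The dashboard code is only fetched when the user actually asks for it,
// so it never weighs down the initial page load.
document.getElementById('show-dashboard').addEventListener('click', async () => {
  const { renderDashboard } = await import('./dashboard.js');
  renderDashboard();
});
```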
GZIP. Compression. This is like the zip files on your laptop: you want to compress your files before you send them down, and there's a really quick way of checking this. If you run the tools, they will tell you the files which are wrong. I love the fact that every single website I Googled to see how they did this, if I Googled something about caching or compression, had a great example of them doing it wrong. Every single site, without exception.
So what you want to look at here is the headers coming back: pick your file, have a look at it, and you should see the content getting gzipped. That just means it's been compressed. If you don't see that, you need to tell your developers to get it enabled. You'll usually see a 2 to 3x saving instantly.
Again, here's our example: our JavaScript would go from one megabyte up to three megabytes if we didn't compress it. This is pretty typical.
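As a quick way to spot uncompressed assets from the console, here's a sketch using the Resource Timing API; when an entry's encodedBodySize is no smaller than its decodedBodySize, it probably came down without gzip or brotli applied.

```js
// Flag any asset whose transferred size isn't smaller than its decoded size.
for (const entry of performance.getEntriesByType('resource')) {
  if (entry.decodedBodySize > 0 && entry.encodedBodySize >= entry.decodedBodySize) {
    console.log('possibly uncompressed:', entry.name);
  }
}
```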
Caching. Tom went over this earlier, so I'm going to fly through it pretty quickly, but it's the simple idea that if you want an asset, in this case foobar.css, you get it from the server, and on the way back the server says: please keep this, it's accurate for an hour. Every time you ask for foobar.css for the next hour, the browser gets it from the local computer instead of going to the server again. This means we never have to make that request; it's automatically there. You want to do this as much as possible: sending nothing is always better than sending something. So, how do we do it? How do we enable it?
Firstly, let's see whether it's enabled or not. Again you can use YSlow, click on the inspect tab, and what you'll see is "add Expires headers". This basically means: tell the browser when this file expires. If it's not there, you need to go and fix it. In this case, again, the site we were checking didn't have it enabled.
What techniques do you want for this? Firstly, you want to make sure every file has a very, very unique name. You don't want mycss.css, because when you update it, you need that name to change.
One technique is called content hashing, which is what we use, where each file name is essentially a random-looking string calculated from the content of the file. So if we change one single byte, one single character in the file, that name will completely change, which means every single version is picked up automatically.
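A minimal sketch of what that looks like in practice, assuming a webpack 5 build; the [contenthash] token bakes a fingerprint of each file's contents into its name, so any change produces a brand new URL.

```js
// webpack.config.js (sketch)
module.exports = {
  output: {
    filename: '[name].[contenthash].js', // e.g. main.3f9c1a7b2d.js
    clean: true,                         // clear out old hashed files on rebuild
  },
};
```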
The other idea, once you want unique names, is to put a query string on the end: ?v=1, ?v=2, ?v=3, but those get out of hand pretty quickly, so I highly recommend the content hashing approach.
Once you've got unique names, what you want to see coming back is cache-control: public. So you can store it publicly, not just privately, and set it for ten years; why not, it's never going to change. In our case Cloudflare cached it too; it was on the CDN as well.
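Hypothetically, the response headers for a content-hashed stylesheet might look something like this; the immutable directive is optional but tells the browser never to bother revalidating the file.

```
HTTP/2 200
content-type: text/css
cache-control: public, max-age=315360000, immutable
```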
You also want to see assets coming from memory or from disk; both of those are good. You don't want to see them coming over the network with a size; you want to see them there instantaneously. So the thing you're looking for under Size is either "from memory cache" or "from disk cache". You can't really control which one it comes from, and either is great; it's still faster than going over the network.
Once you've got this set up, a CDN will work almost instantly; it'll just slot in, basically. You can start putting all this content through a CDN, and that will move it from your server to a server local to the user. So again you'll be minimizing your latency: requests which took, in our case, 160 milliseconds from here to London now go from here to somewhere in California. I can't tell you exactly where the servers are in California, but at 20 milliseconds away, it's pretty close.
So use a CDN. They will solve a bunch of other problems for you which I don't really have time to go into right now, but they'll do SSL termination right, which is really hard to get correct from an optimization point of view. DDoS protection: when you've got people trying to blackmail you or take your site offline, they will deal with that for you; having used these guys a bunch to help us with that, they're fantastic. HTTP/2: they'll roll it out for you, Tom covered that. Compression as well, and sometimes they'll do the minification for you too.

[00:28:41]
How to Optimize HTML
So let's go through some of the other things. So let's talk about HTML for a bit. How do you optimize HTML? Small changes here can actually have profound differences if you get them correct. They’re somewhat subtle, but they're actually pretty easy to consider.
So the first one is, when you look at anything which is a list or any kind of repetition, what you don't want to see is extra elements on the page. You don't want your DOM, the document object model, your HTML essentially, to be overly complicated.
So we see something like this, where you've got a div wrapping a list element; if you can just do it with one element, the list, do it. The browser doesn't have to think about it so much, and therefore it will run faster. Scrolling performance will go up, you'll hit 60 frames per second. And beyond that, it's smaller, so it'll download quicker. It's actually a pretty quick win which can make a big, big difference.
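As a hypothetical before-and-after, here's the kind of wrapper-heavy markup this is talking about and a flattened version; the class names are made up.

```html
<!-- Before: every item drags a redundant wrapper along. -->
<ul class="tracks">
  <li><div class="track-row"><span>Track one</span></div></li>
  <li><div class="track-row"><span>Track two</span></div></li>
</ul>

<!-- After: same content, fewer nodes for the browser to lay out on every scroll. -->
<ul class="tracks">
  <li class="track-row">Track one</li>
  <li class="track-row">Track two</li>
</ul>
```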
Secondly, your pages will load faster, and you can optimize this as well by slicing your content up. What content needs to be at the top? What content can actually come in later, down at the bottom? When you're viewing these pages loading slowly, in slow-mo mode via the WebPageTest tool, you can start seeing how different sites optimize: they'll bring in some of the content first, because that's actually the first key bit of content, then they'll slowly load in the other bits in the background using JavaScript, bringing them onto the page gradually. And this is great, because what you'll see is your site now works in 1.3 seconds instead of 1.5; the user is getting the idea that you're getting going.
Perception is really the name of the game here. I can highly recommend just watching your site in slow motion and seeing how it loads; if it all snaps in in one go, there's room for improvement.
Images. Images are still the biggest problem, in the biggest way. So when you analyze your site, the first thing the tools will probably say is that you need to optimize your images. The good news here is this is actually really, really easy. In this case, what site did I run this on? I think Google Developers. It was saying there were a couple of megabytes of images to save there, and this is pretty normal: every website I looked at had multiple megabytes to save, time and time again.
All you need to do here is remember two things. Well, three things. Get the right type of image for the job: there are essentially two types, vector and raster images. Raster images are pixel-by-pixel layouts, JPEGs and PNGs. Vectors are things that work in curves and shapes; logos are a great example. Just choose the right one.
So in this case, here's a great example of a vector image: this is our logo. In this case we've used SVG, which is a format for vectors. Photoshop, Sketch, all the kinds of tools your designers (I don't mean developers, your designers, sorry) use will be able to output these, and they'll be able to give you the right assets or put them on the website for you. You just need to point them in the right direction and tell them to go and do it.
If you can use a vector it will always always always be faster.
How do you choose? Here's a simple decision. If you need animation, you use GIF. If you want to preserve detail, you go down the PNG route. If you can deal with compression loss, JPEG. Try it: take your images, and when the tools tell you to have a look at a different format (sometimes they will even give you the images in those different formats; I believe PageSpeed will do it for you), try it and just see which one is bigger, and go with the one which is smaller and still suits your needs. Some of them might blur a little bit, and maybe that's not suitable for your use case. Just try it and see.
WebP is the next standard coming online for this now. It's not quite ready yet, in that it's ready to go but it's Chrome only; other browsers don't support it, and it's actually quite hard to get live. But if you're interested, it will give you about a 30% saving on most images in terms of size. It's a bit of a pain to set up because you need to serve WebP and you still need to support the older formats like JPEG as well, so I can't really recommend it for mass use right now; it is a bit tricky to configure.
Make sure your images are the right size. This is where I see the biggest waste by far. If you have a slot which is 100 by 100 pixels and you load in an image which is 200 by 200 pixels, you're genuinely wasting four times the bandwidth you need, because it's a square so you've scaled up both sides. Have a look and check that your images match the hole they're filling on the page, and if they don't, ask why. The most common reason given is "I put them in twice as big so they work on retina screens." There are solutions to this: srcset is a great way of doing it on images, and it's fully supported by, I think, every browser but Opera Mini right now; I'd have to go and check that. Tell your developers to stop being lazy and use it instead.
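Here's a minimal sketch of that srcset approach: the browser picks the 1x or 2x (retina) file itself, so nobody downloads more pixels than their screen can show. The file names are hypothetical.

```html
<img src="cover-200.jpg"
     srcset="cover-200.jpg 1x, cover-400.jpg 2x"
     width="200" height="200"
     alt="Album cover">
```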
These slides will be available later. So you'll be able to get this.
Then what are the tricks we use? Some of these we use all the time and they work fantastically; we'll speed through them because once you get the idea, you'll get the concept. Can we just kid people that the site is loading faster? Not actually make it faster, but improve the perception of speed? First thing: progressive JPEGs. This is a different type of JPEG which, instead of loading top to bottom, loads low-res to hi-res. So you actually start seeing it render really quickly, even though it's not fully complete yet. They're just called progressive JPEGs, and WebP also supports this, I believe.
I love this trick, which is just loading a colour block representing the image in advance; Google will do this. Facebook do this other trick, which is to take the image, compress it down to be really, really small, blur it, and scale it up. So they load in a 200-byte copy of the image before they load in the full 70K image. This is a trick they use in their mobile apps the whole time; iOS and Android do this everywhere. Again, it gives you this perception of speed as the blurs come in and then the actual image loads in later. This is very hard to get right and costs a lot of engineering resources, but it's neat to see what people can do here.
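As a rough sketch of that blur-up idea (not how Facebook actually implement it): a tiny blurred placeholder is shown immediately and the full image is swapped in once it has loaded. The file names and id are hypothetical.

```html
<img id="hero" src="hero-tiny-blurry.jpg" alt="Hero image"
     style="width: 100%; filter: blur(12px);">
<script>
  // Fetch the full-size image in the background, then swap it in.
  const full = new Image();
  full.src = 'hero-full.jpg';
  full.onload = () => {
    const img = document.getElementById('hero');
    img.src = full.src;
    img.style.filter = 'none';
  };
</script>
```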
Let's talk about CSS for a minute. Again, caching: it's the trucks again, it's always the trucks. Check the caching is set up right for your CSS. I'm not going to go into this any more; you might see this slide one more time.
Optimize your CSS includes. With CSS, you want to be putting it right up in the head. That means the browser can load it immediately: as soon as it starts parsing top to bottom, it can start fetching the CSS and getting underway. You don't want to delay it; get it towards the top so the browser knows to go and start fetching it.
You used to be able to see what CSS isn't needed. You can't anymore, Chrome removed this. It was always a pain to get it right. So sadly that's gone. There's a couple of tools down there. If you've got any excess CSS on your website, get rid of it. Like if you're sending stuff to users they do not need, just remove it. It's quite hard to find the right stuff but spend the time looking for it.
There's a more advanced technique where you actually calculate on the fly the CSS necessary to render each page, and you send down just that CSS, specifically targeted for each page. It's hard to get this exactly right, but if your developers are using React, I highly recommend you check out styled-components. This will probably only apply to 10% of you in the room, but if you can pull it off, it's astonishingly fast.
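For the React crowd, here's a tiny, hypothetical sketch of styled-components: because the CSS lives with the component, only the styles for components actually rendered on a page need to be generated for that page.

```jsx
import React from 'react';
import styled from 'styled-components';

// The button's CSS ships only when something renders a PlayButton.
const PlayButton = styled.button`
  background: #1fa0ff;
  color: white;
  border-radius: 4px;
`;

export const Player = () => <PlayButton>Play</PlayButton>;
```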
JavaScript. Let's see what we can do here; this is actually where I think a lot of the big wins are. Caching: do caching. Trucks again. You can see a pattern here.
So this is what most people do with their JavaScript: they'll put it right up in the head, and a lot of third-party scripts actually recommend it. It's really, really bad and it will slow stuff down. The reason is that your browser starts at the top and goes, I need to run this script now, and it'll start downloading it, and not only downloading it but it will insist on running the thing before it will continue rendering the page. You need to stop doing this. If you see your developers doing this, ask them why.
There are two solutions to this. The first one is to move it to just above the closing body tag, so move your JavaScript right down to the bottom of the page. What this means is that the client, the browser, Chrome for example, can render the rest of the page before it gets blocked by this JavaScript. As soon as it sees JavaScript it essentially stops rendering, runs the JavaScript, figures out whether the JavaScript is going to mess with the page, then continues rendering. What you want to do is delay that as late as possible, and the solution is quite simply to move it to the bottom of the page.
The second trick is to make it async. This is supported by most if not all major browsers at this point, and it's very, very simple: you add these five characters, async. At that point it's okay to put it in the head of your page, because the browser knows it can defer it; you're telling it you're not going to change the page, so it can carry on rendering.
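Putting the CSS and JavaScript advice together, a minimal page skeleton might look something like this; the file names are hypothetical.

```html
<html>
  <head>
    <link rel="stylesheet" href="site.css">      <!-- CSS up top, fetched early -->
    <script async src="analytics.js"></script>   <!-- async: safe in the head, won't block rendering -->
  </head>
  <body>
    <!-- ...page content renders without waiting for app.js... -->
    <script src="app.js"></script>               <!-- blocking scripts go just above the closing body tag -->
  </body>
</html>
```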
This one is mainly for you guys: remove all those tracking pixels. Every one you can get away with removing, get it off the page. I would highly recommend you do a review once a month or once a quarter and ask: what is running on our site, and what can I get rid of? These will slow you down more than anything else out there. We fight ads continually; every single time an ad network goes on our site, we have to rewrite their code to make it faster. Try and optimize this stuff, try and get rid of it; that's the best thing to do. I know you're not going to be able to get rid of all of it, because a lot of you use it for your jobs, but have some common sense: find the ones you don't need and get rid of those.
So in summary, there are basically four techniques for all of this. Reduce the number of things you're sending down to the client: just reduce them, get rid of as much as you can. Reduce the size of things: make everything you can smaller. Move it nearer to the client. Make it run faster. Prioritize the order: put the JavaScript at the bottom, the CSS at the top, and get things into roughly the order the client is going to need them. Oh, that's five, you say. And yeah, it might not be your job, but hopefully you can now have a bit more of a conversation with developers about how you want to order things, how you want to prioritize speed in your roadmap, and what you guys can do to help them.
And thanks for listening.

[00:38:19]
Host:
Fantastic. So take a deep breath. Any questions?
Okay, one right in front. Hello. There's a microphone. Incoming mic.

[00:38:33]
Man from audience:
So, all the website performance optimization information that you can find on the web is always telling you that you need to be at 3 seconds or less, right? So when you're troubleshooting, or you're looking at the Network tab and seeing all the files and everything, it has three timings: the DOM loaded time, Finish, and then another one I never remember the name of. Which of those is the definitive one?

[00:38:56]
Mat Clayton:
That's a great question. We tend not to use the Network tab, and the reason for that is that, if you're caching stuff properly, and I'm assuming you're doing that, what you want to do is essentially flush that out to begin with. The best way to do that is using an external third party like webpagetest.org, because they will ensure that the cache is cleared and it's kind of the worst-case scenario for you. So we tend not to use the local Network tab for that reason; we tend to use these third-party tools, and you can too.
But honestly, you can end up with plugins installed doing weird stuff, or with service workers running doing weird stuff, or, more likely, the cache is not cleared. So what I would recommend is using webpagetest.org, and I suggest that because it just sets you on a level playing field before you start, and then you're looking at those two vertical lines we talked about. We specifically try to move the first paint one as far left as possible, because for the JavaScript one, for us, there are lots of tricks going on in the background to get ready for page 2, page 3, page 4 as the user is clicking. Does that help?

[00:39:49]
Man from audience
Yeah. Thank you.

[00:39:53]
Man from audience:
So I have two questions. One was in regard to HTTP/2 versus HTTP 1.1: there was the limitation of essentially six trips at a time, and you're saying with HTTP/2, is there actually no limit, or is it just a really high limit?

[00:40:10]
Mat Clayton:
So with HTTP 1.1, it's not that the connection has a limit; all the browsers just open six. So they open essentially six roads, or six connections. It's not an official limit; it's just that every browser has implemented it the same way. I think they did some testing in the early days and kept with it. I've not seen any limits on HTTP/2 yet; I'm not aware of any, and we've seen sites come down with hundreds in parallel. There might be a theoretical one, but it's not relevant.

[00:40:36]
Man from audience:
Okay, and the second question was in regard to time to first byte. I've noticed a lot of times with a CDN you'll see a larger time to first byte even when the actual page load is faster, but you see these almost-warnings come through on things like WebPageTest, if that's the name, or whatever it is, and also Google's tools. They'll warn you that, hey, your time to first byte is taking way too long. For example, I have a landing page that's coming in at about 3.6 seconds on 3G.

[00:41:10]
Mat Clayton:
On the page speed? On the HTML or on the assets?

[00:41:12]
Man from audience:
On the DOM, the actual, you know... but the time to first byte of that 3.6 seconds is like 2.2 seconds. If I disable the CDN, the time to first byte comes down dramatically, but the actual page load time takes longer.

[00:41:31]
Mat Clayton:
Yes. I'd need to have a look at it to say exactly what's going on. My suspicion is that the CDN you're using is not doing streaming. What this is is the idea that, when the response is ready from the far end, it doesn't all have to go to the CDN provider, which waits for the whole thing to download and then sends it out to you. The older CDNs will do that: they buffer it all, essentially, and then send it to you, which impacts your TTFB quite badly because it waits for everything to download before it sends it. The more modern ones actually stream it through. Cloudflare streams through, and I think Fastly does as well; those are the two I've checked, I've not looked at any others. We've seen this buffering before; it particularly impacts us with music. You can imagine we have a 50-megabyte file where the worst-case scenario is you have to get all of that 50 meg through before the user starts hearing anything. So we suffered from that pretty badly for a while, and the Firefly tool doesn't do that.

[00:42:21]
Man from audience:
Okay. So I'll get hold of them afterwards, but that's something we are experiencing.

[00:42:25]
Mat Clayton:
Is it with Firefly?

[00:42:26]
Man from audience:
Yeah.

[00:42:27]
Mat Clayton:
I’ll talk to them about it.

[00:42:28]
Man from audience:
Okay. Thanks.

[00:42:25]
Host:
Fantastic, that's all we have time for. Mat, thank you so much for being here.
