Amazon AWS Certified SysOps Administrator Associate – CloudFront Part 2


4. CloudFront Caching – Deep Dive

So here is a deep dive lecture on CloudFront caching. On CloudFront you can cache based on multiple things: headers, session cookies, and query string parameters. Based on the combination of what you set for these parameters, the cache is going to be more or less efficient. Now, the cache lives at each of CloudFront's edge locations around the world.

The idea is that the client makes a request to an edge location, and if the edge location has the data in its cache (keyed on the headers, the cookies, and so on, and still within the TTL), it serves the data directly from the cache. If the data is not in the cache, the edge location forwards the request to the origin, retrieves the data, sends it back to the client, and caches the result. So you want to maximize the cache hit ratio to minimize requests on the origin; this is the whole idea behind using a CloudFront distribution, and you can control the time to live (TTL).

The TTL is controlled with a header sent by the origin, and there are two headers you can use: the Cache-Control header and the Expires header. You can also invalidate parts of the cache using the CreateInvalidation API.
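As a quick illustration, here is a minimal sketch of calling the CreateInvalidation API with boto3; the distribution ID and the paths are placeholders, not values from the lecture.

```python
import time
import boto3

cloudfront = boto3.client("cloudfront")

# Invalidate every object under /images/ plus one specific file.
# The distribution ID and paths below are placeholders.
response = cloudfront.create_invalidation(
    DistributionId="E1234EXAMPLE",
    InvalidationBatch={
        "Paths": {"Quantity": 2, "Items": ["/images/*", "/index.html"]},
        # CallerReference must be unique per invalidation request.
        "CallerReference": str(time.time()),
    },
)
print(response["Invalidation"]["Status"])  # e.g. "InProgress"
```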

Now let's do a deep dive into the caching behavior for each of the following, starting with headers. When the client sends an HTTP request to your CloudFront distribution, it passes headers with it, which is what you see in this diagram: Host, User-Agent, Date, Authorization, Keep-Alive, Accept-Ranges, each with a value, and the headers can really be whatever you want. So the request says: GET this image, cat.jpg, using HTTP/1.1, and here are the headers I want to pass to you. You can configure CloudFront in three ways. You can forward all the headers to your origin, which means there is no caching and every request goes to the origin; effectively you are not using CloudFront for caching if you forward all headers, and in this case the TTL must be set to 0 because you are not doing any caching in CloudFront. If you forward a whitelist of headers, meaning you only forward some headers as part of the request, then the caching is going to be based on all the values in the specified headers.

And if you choose none, you forward zero headers, so only the default headers are forwarded to the origin and there is no caching based on the request headers. That obviously gives you the best caching performance, because the headers are simply removed from the request. So whether you forward all headers, a whitelist of headers, or zero headers is up to you and to what your application does. Maybe your application needs some headers, maybe it doesn't, and it's very important to understand that this is application-specific behavior. But here I'm explaining how the CloudFront cache works, okay?

So if you have a look, we have a request being made with a lot of headers, and in this example we apply whitelisting: we whitelist the Host and the Authorization headers. Only these headers are passed on to your origin, and the caching happens at this stage. Because you pass fewer header values from your clients all the way to your origin, you get better caching, because there are fewer distinct values to cache on. The idea is that if the same request comes in with the same headers, CloudFront knows how to respond to that request directly. So it's up to you, again, to decide how you want this behavior to be set up; a small configuration sketch follows below.
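To make this concrete, here is a hedged fragment of a cache behavior using the legacy ForwardedValues settings (the shape boto3's create_distribution or update_distribution expects); the header names are just the ones from the example above, and a real config needs many more fields.

```python
# Fragment of a DefaultCacheBehavior inside a CloudFront DistributionConfig.
# Only the caching-related fields are shown here.
default_cache_behavior_fragment = {
    "ForwardedValues": {
        "QueryString": False,
        "Cookies": {"Forward": "none"},
        # Whitelist only the headers we actually cache on and forward.
        "Headers": {"Quantity": 2, "Items": ["Host", "Authorization"]},
    },
    "MinTTL": 0,
}
```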

Now, there is a big question that can come up at the exam about CloudFront Origin Custom Headers versus CloudFront cache behavior settings. Origin Custom Headers are a setting you define on the origin itself: it is an origin-level setting that adds a custom header name and header value to every single request made to your origin. That means no matter what headers the request comes in with, CloudFront will add on the headers you specify here. They're called Origin Custom Headers. A use case would be, for example, if you wanted to tell your origin that a request was coming from CloudFront, you could define a custom header for this. So these headers are custom and constant, no matter what.
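Here is a small sketch of what that looks like in the Origins part of a distribution config; the origin ID, domain, header name and value are hypothetical, just to illustrate the "tell the origin the request came through CloudFront" use case.

```python
# Fragment of one origin inside a CloudFront DistributionConfig.
# CustomHeaders are added to every request CloudFront sends to this origin.
origin_fragment = {
    "Id": "my-app-origin",            # hypothetical origin id
    "DomainName": "app.example.com",  # hypothetical origin domain
    "CustomHeaders": {
        "Quantity": 1,
        "Items": [
            # Constant header added to all requests, regardless of the client.
            {"HeaderName": "X-From-CloudFront", "HeaderValue": "true"}
        ],
    },
}
```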

But then you have behavior settings, which allow you to set your whitelist of headers. This is something you set at the behavior level, and that's why it's called the cache behavior, because it is used for caching. The Origin Custom Headers are not used for caching, okay? They are used just to pass headers on to the origin, whereas the second setting is a cache behavior.

So these are cache-related settings, and they contain the list of all the headers to forward to your origin and to cache on. As you can see in my example, I'm whitelisting the CloudFront-Is-Desktop-Viewer and CloudFront-Is-Mobile-Viewer headers, so they are passed on to my origin. Next we have the caching TTL. As I said, the origin can respond with a header to control it, and that header can be Cache-Control: max-age.

Or there is an Expires header as well, but the newer standard and the best way to do things with CloudFront is to use the Cache-Control: max-age header when replying from the origin to CloudFront. And if the origin always sends back the Cache-Control header, then you can set the TTL to be directly controlled by that header, and therefore by your application.

But in case you want to set min and max boundaries for the TTL, you can choose Customize in the object caching settings. Again, this is a behavior-level setting, because it is caching behavior. For object caching you can either say Use Origin Cache Headers, in which case your application sets the cache duration no matter what, or you can customize it to have a minimum TTL, a maximum TTL, and a default TTL. The idea is that if the Cache-Control header is missing from a reply from your origin, it will fall back to the default value. So if you have a look at the TTL: there is a Min TTL no matter what, a Max TTL no matter what, and then your application returns a Cache-Control header.

That header is optional but recommended. If the Cache-Control value is less than the Min TTL, obviously the Min is going to be used. If the Cache-Control value is more than the Max TTL, then the Max is going to be used. And if the Cache-Control header is missing, the TTL that gets applied is the default TTL you set up in these settings. Hopefully that helps you understand how caching and the TTL work; the small sketch below spells out the same rule in code.
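This is just an illustrative helper (not an AWS API), assuming the behavior described above: clamp the origin's max-age between the configured Min and Max TTL, and fall back to the default TTL when the header is absent.

```python
from typing import Optional

def effective_ttl(max_age: Optional[int], min_ttl: int, default_ttl: int, max_ttl: int) -> int:
    """Return the TTL that would be applied, per the rules described above.

    max_age is the value from the origin's Cache-Control: max-age header,
    or None if the header is missing.
    """
    if max_age is None:
        return default_ttl                       # no Cache-Control header -> default TTL
    return max(min_ttl, min(max_age, max_ttl))   # clamp between Min and Max TTL

# Example with Min=60s, Default=3600s, Max=86400s:
print(effective_ttl(None, 60, 3600, 86400))    # 3600  (header missing)
print(effective_ttl(10, 60, 3600, 86400))      # 60    (below Min, Min wins)
print(effective_ttl(604800, 60, 3600, 86400))  # 86400 (above Max, Max wins)
```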

Now let's have a look at cookies and query string parameters; the idea behind them is quite similar. So what are cookies? Well, cookies are a specific header: inside the header named Cookie you have a lot of key-value pairs. In this example we have username=johndoe, location=UK, lang=en, and user-id=12342. The idea is that you are passing four cookies as part of this request, and cookies, again, have three different settings.

The default is to not process the cookies, which means the caching is not based on cookies and the cookies are not forwarded from CloudFront to your origin. Or you can forward a whitelist of cookies, in which case the caching is based on the values of all the specified cookies. Or you can forward all cookies, which obviously gives you the worst caching performance but allows your application to use all of them.

So again, what you set depends on your application; how cookies are used within your application is very important, and you set up CloudFront accordingly to get the best caching performance. It's the same idea here: I have a request and we whitelist the user-id cookie. My origin only receives this request with the cookie user-id=12342, that is all that gets forwarded, and you get better caching because there are fewer cookie values between what is sent from the client to CloudFront and what is passed on from CloudFront to your origin. A configuration sketch for this follows below.
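Here is a hedged fragment of a cache behavior whitelisting just that cookie, again using the legacy ForwardedValues settings; the cookie name matches the example above.

```python
# Cache behavior fragment: cache on and forward only the user-id cookie.
cookie_forwarding_fragment = {
    "ForwardedValues": {
        "QueryString": False,
        "Cookies": {
            "Forward": "whitelist",
            "WhitelistedNames": {"Quantity": 1, "Items": ["user-id"]},
        },
        "Headers": {"Quantity": 0, "Items": []},
    },
}
```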

Similar behavior applies to query string parameters. If we have a look at this, it's a GET, and this time there is a question mark in the URL with border=red and size=large, and these parameters are always part of the URL. You have three options. The default is to not process them, which means they are not passed to your origin and caching is not based on these query strings. Or you can forward a whitelist of query strings, and then caching is based on that whitelist. Or you can forward all query strings, and then caching is based on all parameters, but again this gives you the worst caching performance because you have many more values. A sketch of the whitelist option follows below.
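Here is the corresponding hedged fragment, forwarding query strings but keying the cache only on the whitelisted size parameter (the example used next); field names follow the legacy ForwardedValues settings.

```python
# Cache behavior fragment: forward query strings, but key the cache on "size" only.
query_string_fragment = {
    "ForwardedValues": {
        "QueryString": True,
        "QueryStringCacheKeys": {"Quantity": 1, "Items": ["size"]},
        "Cookies": {"Forward": "none"},
        "Headers": {"Quantity": 0, "Items": []},
    },
}
```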

So as you can see, these are three different concepts: we have headers, we have cookies, and we have query string parameters. But the idea is that the settings are very similar and the behavior is very similar. In this example I just whitelist the size query string parameter, so only size is passed to my origin, and then my origin can make a decision based on that. You get better caching because there are fewer query string parameter values. So finally, how do you maximize cache hits? Well, you can separate static and dynamic distributions.

So say you have a CloudFront layer: all the static requests should go through CloudFront into a static content S3 bucket. Here you don't use headers, you don't use cookies, you don't need caching rules, and you maximize cache hits because the content coming back from S3 is static. Then all the dynamic content, everything that can change, you pass on to your application, running maybe on an ALB plus EC2 instances, or API Gateway and Lambda.

Maybe those will be using some headers and some cookies; then you configure your CloudFront distribution just like I showed you with the previous settings, you pass on exactly what is needed by your application, and again you maximize the cache hits there. So it's quite recommended to split these two, and a sketch of that split follows below.
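As a rough illustration, assuming an S3 origin for static content and an ALB origin for the dynamic application, the cache behaviors of a single distribution could be split along these lines; the path pattern, origin IDs, and header/cookie names are hypothetical.

```python
# Sketch of splitting one distribution into a static and a dynamic behavior.
# Static assets under /static/* come from S3 with no headers, cookies, or
# query strings in the cache key; everything else goes to the ALB origin
# and forwards only what the application actually needs.
cache_behaviors_fragment = {
    "CacheBehaviors": {
        "Quantity": 1,
        "Items": [
            {
                "PathPattern": "/static/*",
                "TargetOriginId": "static-s3-origin",   # hypothetical id
                "ViewerProtocolPolicy": "redirect-to-https",
                "ForwardedValues": {
                    "QueryString": False,
                    "Cookies": {"Forward": "none"},
                    "Headers": {"Quantity": 0, "Items": []},
                },
                "MinTTL": 0,
            }
        ],
    },
    "DefaultCacheBehavior": {
        "TargetOriginId": "dynamic-alb-origin",          # hypothetical id
        "ViewerProtocolPolicy": "redirect-to-https",
        "ForwardedValues": {
            "QueryString": True,
            "Cookies": {
                "Forward": "whitelist",
                "WhitelistedNames": {"Quantity": 1, "Items": ["user-id"]},
            },
            "Headers": {"Quantity": 1, "Items": ["Authorization"]},
        },
        "MinTTL": 0,
    },
}
```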

So to summarize, to increase the cache hit ratio: have a look at the CloudWatch metrics, specify how long objects should stay in your cache using the Cache-Control: max-age header, specify none or only the minimally required headers, specify none or only the minimally required cookies, and the same for the query string parameters. And finally, separate the static and dynamic distributions, which means you can have two origins behind your CloudFront distribution. So I hope that helps, I hope you understand the idea behind caching, and this will help you answer one or two questions at the exam. I hope you liked this lecture, and I will see you in the next one.

5. CloudFront with ALB Sticky Sessions

So here is a short lecture on how to use CloudFront with your ALB when you have enabled sticky sessions. Say you have an Application Load Balancer and a target group, and you have enabled sticky sessions because you want requests from the same user to go to the same backend EC2 instance. You set up CloudFront in front with an edge location, and you want the two to work together. The solution is to forward the cookie that controls the session affinity to the origin, so that the session affinity still works; if you don't forward that cookie, it obviously won't be passed on to the ALB, and then your session affinity is not going to work. So concretely, the user does a GET and passes on a cookie, and maybe you've kept the default cookie name.

So the cookie is AWSALB=<some value>; CloudFront is going to whitelist this AWSALB cookie, and the cookie is forwarded to the ALB. That means the ALB sees the cookie, and therefore knows to send the requests to the same EC2 instance every time for that user. If another user passes on a request, the cookie is still named AWSALB of course, but the value is going to be different because it's a different user. That request again forwards the value of that cookie, the ALB forwards it to another EC2 instance, and again it is a sticky session for that user. A small configuration sketch follows below.
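Here is a hedged fragment of the cache behavior for this setup, whitelisting the AWSALB cookie and keeping cached responses short-lived; the TTL values are just illustrative, not from the lecture.

```python
# Cache behavior fragment for CloudFront in front of an ALB with sticky sessions:
# forward the AWSALB cookie so session affinity keeps working, and keep the
# TTL short so cached responses don't outlive the session cookie.
sticky_session_fragment = {
    "ForwardedValues": {
        "QueryString": True,
        "Cookies": {
            "Forward": "whitelist",
            "WhitelistedNames": {"Quantity": 1, "Items": ["AWSALB"]},
        },
        "Headers": {"Quantity": 1, "Items": ["Host"]},
    },
    "MinTTL": 0,
    "DefaultTTL": 60,   # illustrative: shorter than the session cookie lifetime
    "MaxTTL": 300,
}
```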

So this is an important setup to see, because it makes sense once you see it, but it's a trick you have to know. If you're using an ALB with sticky sessions behind CloudFront, please whitelist either all cookies or at least the cookie that controls the session affinity. Also, as a security measure, you should set the TTL of your cached responses to a value less than the expiration of the authentication cookie. But this is a very detailed point and not something the exam will test you on. So that's it for this lecture, I hope you liked it, and I will see you in the next lecture.
