Google Associate Cloud Engineer – Other Important Services Part 2


5. Step 05 – Getting Started with Cloud DNS

Welcome back. In this step, let's talk about Cloud DNS. Think about this: you want to set up the in28minutes.com website. What would be the steps in setting up a website with a domain name? The first step would be to buy the domain name in28minutes.com. I can go to a domain registrar like GoDaddy. Even Google Cloud offers domain registration services. So I can go to one of the domain registrars and say, okay, I would want to own the domain name in28minutes.com. The next thing I would need to do is to set up the website content. If it is static content, I might host it in Google Cloud Storage. If it is a dynamic web application, I might want to use App Engine, or I might have a Kubernetes cluster where my application is running.

So there are a number of ways I can set up my website content. That's what is called website hosting. I can run it in Compute Engine, App Engine, Kubernetes Engine, or any of the other options that Google Cloud supports. This can also be an external host, something outside Google Cloud. Step three is to route the requests that come in to in28minutes.com to the place where my website is hosted. So I would need to say: whenever somebody types in in28minutes.com, we need to start serving requests from this web host. How can we do that? That's where we make use of DNS, the Domain Name System. And Cloud DNS is the DNS service provided by Google Cloud. So Cloud DNS is a global Domain Name System.

It is what we make use of to do step three here: when a user sends a request to in28minutes.com, we would want to map it to a host server, to an IP address. And that's where Cloud DNS comes into the picture. So it is used to set up DNS routing for your website, for example, in28minutes.com. You can say any request which comes to api.in28minutes.com needs to go to the IP address of the API server. Let's say I have a full stack application where I have APIs in the back end and static content, a front end application, in the front end. I can say api.in28minutes.com needs to go to the API server, and static.in28minutes.com needs to go to the IP address of the HTTP server.

Maybe this is Cloud Storage. Maybe this is App Engine. You can also set up routing for your emails. So if somebody sends an email to ranga@in28minutes.com, it needs to be sent to this specific mail server which is installed at this specific address. Cloud DNS supports two kinds of zones. Whenever you want to create records like this, the first thing you would do is create a zone. Inside the zone we would create multiple records. So a zone is nothing but a container for records. And in Cloud DNS you can create public and private managed DNS zones. Public zones are exposed to the Internet.

For in28minutes.com, I would want everybody to be able to type in in28minutes.com and go to my website. In that kind of situation I would make use of a public managed DNS zone. However, in certain situations I would want an internal DNS, something which is available in a specific VPC or accessible only from a specific subnet. In those kinds of situations I can go for a private managed DNS zone. Let's go to DNS. This is Cloud DNS. You might need to enable the APIs for Cloud DNS if they are not enabled already. As you can see in here, it says highly available global DNS network.
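If you prefer the command line, you can enable the same API with gcloud. A minimal sketch, assuming the gcloud CLI is installed and a project is already selected:

# Enable the Cloud DNS API for the current project
gcloud services enable dns.googleapis.com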

Let's go and say Enable. Google Cloud DNS is a scalable, reliable, and managed authoritative Domain Name System. Running on the same infrastructure as Google, it provides you with a low-latency, high-availability, and cost-effective way to make your applications and services available to your users. The APIs are enabled, so we don't really need to read that anymore. Let's go in here. Once you are in Cloud DNS, the first thing that you would do is create a zone. I would want to create records like: api.in28minutes.com needs to go to a specific IP address.

Before I'm able to create the records, I would need to create a zone. So let's go and say Create zone. Over here you can either choose to create a public zone or a private zone. If you want to create a public zone, you need to be the owner of that specific website. So if you want to create a public zone for in28minutes.com, you need to be the owner of in28minutes.com. The other option is to go for a private zone. Over here we just give it a name. I'll say my private zone, and I'll give it a DNS name as well, using the same thing: my private zone. If it's a public zone, this is where you would specify your website name, something like www.in28minutes.com.

Since we are creating a private DNS zone, over here you'd be able to choose the networks from which you would want to use the DNS zone. Because it's a private zone, I will need to choose the VPC from which I can use that specific DNS zone. So I can say I would want to use it from default, or from my first VPC. I'll say my first VPC is where I would want to use it from. This is the VPC which we created earlier in the VPC section of the course. And now I can go and say Create. This would create our DNS zone. Once we have a DNS zone, you'd be able to go in and add records. So I can click Add record set, and you'd be able to see that there is a wide variety of record types that I can make use of.

So I can say if somebody sends a request to api under my private zone, then I would want to route it to a specific IP address. Or if somebody sends a request to frontend under my private zone, then I would want to send them over to a specific IP address. So DNS is just a way to manage the mapping from a name like in28minutes.com to an IP address. To be able to do that, the first thing that you would do is create a zone. A zone is nothing but a container for records, and inside the zone you would create records. You can also work with Cloud DNS from the CLI. Let's quickly look at how to do that. The first thing that you need to do is create a managed zone. So whenever I want to do something with DNS, the first thing I need is a managed zone.

So let's go ahead and create a managed zone: it's gcloud dns managed-zones create my-private-zone, for example. You need to give it a description. You need to give it a DNS name. Earlier we said my private zone is the DNS name. We will also need to specify the visibility: either private or public. Public is the default. If you want, you can make it private. If you make it private, then you can choose the networks. So you can say --networks and list the VPC IDs of the networks that your zone should be visible from.
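Putting those flags together, a rough sketch (the zone name, DNS name, and VPC name here are hypothetical placeholders):

# Create a private managed zone, visible only from the listed VPC networks
gcloud dns managed-zones create my-private-zone \
    --description="Private zone for internal services" \
    --dns-name="myprivatezone.com." \
    --visibility=private \
    --networks=my-first-vpc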

Once you create the managed zone, you can add records. Adding records to Cloud DNS is not a single command; it's a three-step process, because DNS changes can be very, very sensitive and you might want to add multiple records at the same time. This follows something like a transaction approach: you start a transaction, make the changes, and then commit the transaction. So you'd see commands similar to that. There are three steps to add records to a managed zone. First, start a transaction for the zone: you'd say gcloud dns record-sets transaction start. Then you would go ahead and make the changes: I'd want to add a specific type of record, say an A record or a CNAME record. And finally, end the transaction.
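A sketch of the full transaction flow (the zone name, record name, TTL, and IP address are placeholders):

# Step 1: start a transaction for the zone
gcloud dns record-sets transaction start --zone=my-private-zone

# Step 2: stage one or more changes, for example an A record mapping a name to an IP
gcloud dns record-sets transaction add "10.0.0.10" \
    --name="api.myprivatezone.com." \
    --ttl=300 \
    --type=A \
    --zone=my-private-zone

# Step 3: commit all staged changes at once
gcloud dns record-sets transaction execute --zone=my-private-zone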

Now, the exact commands are not really important here. The important thing to remember is that when you want to map a name to an IP address, DNS is the way to go. In DNS, the important concepts to remember are managed zones and records. If you want to have a set of records, you need to first create a managed zone. Inside that, you would create the records. And if you want to add records to a managed zone from the CLI, then you need to use transactions. You start a transaction and make the changes. The changes might be multiple: this is one change, and you might want to add other records. Make all the changes you'd want to make, and at the end, end the transaction. I'm sure you're having a wonderful time, and I'll see you in the next step.

6. Step 06 – Getting Started with Cloud Dataflow

Welcome back. In this step, let's look at Cloud Dataflow. Cloud Dataflow is a very, very difficult service to describe. If you type in Dataflow in here and ensure that the Dataflow APIs are enabled, you'd go into this specific screen. Dataflow provides unified stream and batch data processing that's serverless, fast, and cost-effective. If you go to Create job from template, you would see that there is a wide variety of templates provided by Dataflow, and these templates can be used to create your jobs. You can see that Cloud Dataflow can be used to do exports and imports from a variety of places. Let's just type in BigQuery. You can see that you can transfer data to BigQuery from a wide variety of places.

Hive, Kafka, Pub/Sub, Cloud Storage. And if you scroll down, you'd also see import options. If you type in Bigtable, for example, you can see that you can export to Cloud Storage in a variety of formats. Again, if you type in Datastore, you can see that you can use Dataflow to transfer data from Datastore to Cloud Storage and from Cloud Storage to Datastore. In addition to that, you can also do a lot of streaming using Dataflow. Just type in Pub/Sub. Pub/Sub is what is typically involved in data streaming use cases. You have a lot of messages coming in, you put them in a Pub/Sub topic, and from there you can use Dataflow to take that data anywhere you'd want to.

For example, BigQuery. Or you can go to Cloud Storage, or, if you want, you can take it to another Pub/Sub topic, or you can send it to Splunk. If you want to send your data to Splunk and do processing around it, you can do that as well. So you can see that Dataflow can be used in streaming: continuous data which keeps flowing in is a stream. You can use Dataflow in streaming use cases as well as in batch use cases. And that's the reason why I call it a very difficult service to describe. Let's look at a few example pipelines you can build. We have looked at some of them just now. Pub/Sub to Dataflow to BigQuery: this is a streaming pipeline, right? Dataflow is taking something from Pub/Sub and writing to BigQuery.
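To make that concrete, here is a sketch of launching the classic Pub/Sub-to-BigQuery template from the CLI. The template path and parameter names follow Google's published classic templates, so verify them against the current template picker; the project, topic, bucket, and table names are hypothetical placeholders:

# Launch a streaming job from the Pub/Sub Topic to BigQuery classic template
gcloud dataflow jobs run pubsub-to-bq-demo \
    --gcs-location=gs://dataflow-templates/latest/PubSub_to_BigQuery \
    --region=us-central1 \
    --staging-location=gs://my-bucket/temp \
    --parameters=inputTopic=projects/my-project/topics/my-topic,outputTableSpec=my-project:my_dataset.my_table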

So you can have a stream of data which can be sent out to BigQuery. This is streaming files: you have files coming over to Pub/Sub, and you're using Dataflow to write them to Cloud Storage. You can also use it for batch loading data into databases. You can pick up something from Cloud Storage and write it to Bigtable, Cloud Spanner, Datastore, or BigQuery: most of the databases. In addition to that, you can also do a lot of batch activities. For example, you have a lot of files in Cloud Storage and you'd want to compress them. You can bulk compress files in Cloud Storage. If you just go in here and search for compress, you'd be able to see this in here: Bulk Compress Files on Cloud Storage and Bulk Decompress Files on Cloud Storage.
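As another sketch, that bulk compression utility can be launched the same way from the CLI. The template name and parameters below follow the classic Bulk_Compress_GCS_Files template as documented, so double-check them on the template's details page; the bucket paths are placeholders:

# Launch a batch job that gzips files matching a pattern in Cloud Storage
gcloud dataflow jobs run compress-files-demo \
    --gcs-location=gs://dataflow-templates/latest/Bulk_Compress_GCS_Files \
    --region=us-central1 \
    --parameters=inputFilePattern=gs://my-bucket/logs/*.txt,outputDirectory=gs://my-bucket/compressed,outputFailureFile=gs://my-bucket/failures.csv,compression=GZIP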

If you want to convert between file formats, you would see that most big data tools handle multiple formats: Avro, Parquet, CSV. You can convert between these file formats using Cloud Dataflow. If you just type in convert in here, you can see that there is a utility which helps you convert file formats between Avro, Parquet, and CSV. As you can see, Cloud Dataflow is an integral part of different kinds of workflows in Google Cloud Platform. It can be used for streaming and batch use cases. So things like real-time fraud detection, a lot of data coming in from IoT sensors, log data which is flowing in, or batch processing like loading data or converting formats: in all these use cases you can make use of Cloud Dataflow.

We saw that there are a lot of pre-built templates on offer. You can choose a template, fill in the details, and create the jobs that you'd want to create. An important thing to remember is that Cloud Dataflow is based on an open source framework called Apache Beam, and to process the data as it flows, you can write programs in Java, Python, Go, and a number of other languages. Cloud Dataflow is serverless, and it autoscales based on the resources needed to run the specific pipeline you are executing. Cloud Dataflow automatically provisions resources and takes care of infrastructure. So it's serverless and autoscaling. The idea behind this step was to give you a high-level overview of Dataflow. I'm sure you're having an interesting time, and I'll see you in the next step.

7. Step 07 – Getting Started with Cloud Dataproc

Let's look at Cloud Dataproc. What is Cloud Dataproc and when do you use it? Cloud Dataproc is a managed Spark and Hadoop service. You go for Spark and Hadoop for workloads where you'd want to get some intelligence from your data: ML or AI. There are a lot of frameworks in the Spark and Hadoop ecosystem, and Cloud Dataproc supports most of them. Spark, PySpark, SparkR, Hive, Spark SQL, Pig, Hadoop: all these jobs can be run using Dataproc. Typically, Spark and Hadoop are used to perform complex batch processing. And when you are using Spark and Hadoop, you typically have a cluster. With Dataproc, you can choose between multiple cluster modes. You can have a single node cluster, although then it's not really a cluster, it's a single node.

Or you can have a standard cluster, or a High Availability cluster. In a High Availability cluster, you have three masters, so even if one of the masters is down, you can still continue processing. An important thing to remember is that Dataproc uses virtual machines, and therefore you can choose the characteristics of the virtual machines you'd want to make use of. You can either choose regular virtual machines, or, if your jobs are cost-sensitive, fault-tolerant, and not really urgent, you can even use Preemptible VMs. A good use case for Dataproc is if you have existing Spark and Hadoop clusters and you'd want to move them to Google Cloud.
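As a sketch, creating a High Availability cluster with preemptible secondary workers and then submitting a job to it might look like this from the CLI. The cluster name, region, machine types, and script path are hypothetical placeholders:

# Create a High Availability Dataproc cluster (3 masters, 2 primary workers)
# plus 2 secondary workers, which are preemptible by default
gcloud dataproc clusters create my-cluster \
    --region=us-central1 \
    --num-masters=3 \
    --num-workers=2 \
    --master-machine-type=n2-standard-4 \
    --worker-machine-type=n2-standard-4 \
    --num-secondary-workers=2

# Submit a PySpark job to the cluster once it is up
gcloud dataproc jobs submit pyspark gs://my-bucket/word_count.py \
    --cluster=my-cluster \
    --region=us-central1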

So if you have machine learning or AI development which is done using Hadoop and Spark and you'd want to move it to the cloud, a tool you might want to check out is Dataproc. Remember that Cloud Dataproc is a data analysis platform. So whenever you export something, you are actually exporting the cluster configuration, not the data. If you look at alternatives to Dataproc in Google Cloud Platform, the comparable alternative is BigQuery. If you have petabytes of data and you'd want to do analysis on it using SQL queries, you'd go for BigQuery. However, you'd go for Dataproc when your analysis involves more than SQL queries and you want to build a complex application to do the intelligent analysis.

So if you want to do complex batch processing for machine learning or AI workloads, you can go for Cloud Dataproc, which is based on Spark and Hadoop. We're not really going to create a Dataproc cluster; let's just get a quick overview of it. So let's go to Cloud Dataproc. If need be, go ahead and enable the APIs for it. I'll go to Cloud Dataproc, and once you are in here, you can see that you can create a cluster, and then you can set up jobs and workflows. Looks like I have not enabled the Dataproc API before, so let's go ahead and enable that. As you can see in here, the Dataproc API manages Hadoop-based clusters and jobs on Google Cloud Platform. And you can see the service name for this API in here, which is dataproc.googleapis.com.

Okay, we are in Dataproc now. It lets you provision Hadoop clusters and connect to the underlying data stores. As you can see in here, you can create clusters, jobs, and workflows. Let's try and see what's involved in creating a cluster. If you go and create a cluster, you can see that there are different steps. The first thing that you need to choose is where you want to create the cluster and what type of cluster you want: a standard cluster, a single node cluster, or a high availability cluster. After that, you can configure autoscaling and also the components that you would want to install. When using Hadoop, you might want to make use of a number of components.

Basic ones are automatically installed, and you have the option to choose optional components in here: things like ZooKeeper, HBase, Flink, and Docker. The next thing that you can choose is the shape of your cluster: which virtual machines should be used to create the master node, and which virtual machines should be used to create the worker nodes. You can choose the hardware configuration of both the master nodes and the worker nodes. You can choose the machine family, and you can choose what kind of hard disk you want to attach to that specific node type. After that, you can customize the cluster, manage the security of the cluster, and click Create. I don't really want to create a cluster right now.

I'll just say Cancel and get out of here. In this step, we looked at Cloud Dataproc. It is a managed Spark and Hadoop service in Google Cloud. It supports most of the jobs which you'd want to run as part of a Hadoop cluster. You saw that Cloud Dataproc offers you a lot of flexibility in how you create your clusters. You can go for a single node cluster, a standard cluster, or a high availability cluster. You also get to choose what kind of hardware needs to be used to run your master nodes and your worker nodes. The important takeaway from this step is that if you want to run Hadoop jobs or Spark jobs in Google Cloud, the service you'd want to consider is Dataproc. I'll see you in the next step.
