
Pass Your ISC-CCSP Certification the Easy Way!

100% Real ISC-CCSP Certification Exams Questions & Answers, Accurate & Verified By IT Experts

Instant Download, Free Fast Updates, 99.6% Pass Rate.

ISC-CCSP Bundle

$69.99

ISC-CCSP Certification Bundle

Certified Cloud Security Professional (CCSP)

Includes 512 Questions & Answers

ISC-CCSP Certification Bundle gives you unlimited access to "ISC-CCSP" certification premium .vce files. However, this does not replace the need for a .vce reader. To download your .vce reader click here


Download Free ISC-CCSP Practice Test Questions VCE Files

Exam: CCSP
Title: Certified Cloud Security Professional (CCSP)
Files: 5

ISC-CCSP Certification Exam Dumps & Practice Test Questions

Prepare with top-notch ISC-CCSP certification practice test questions and answers, VCE exam dumps, study guides, and video training courses from ExamCollection. All ISC-CCSP certification exam dumps and practice test questions and answers are uploaded by users who have passed the exam themselves and formatted into VCE files.

Domain 1 (Architectural Concepts & Design Requirements)

17. IAM Access Roles

So guys, in the last slide we talked about cryptography, security for data in motion, and security for data at rest. Next is key management and the common approaches to it: a remote key management service or a client-side key management service. And after that comes IAM and access control. So, better yet, let's go for a practical session right now covering all the slides I have just explored, okay? First, I'm logging into the Azure subscription that I have got; you can go to microsoft.com or portal.azure.com and follow along. What I did was go to "Create a resource", then to Storage, and I created a storage account, this one. When you click on this, you can choose any one of several storage services, and there are multiple. But here, just give it a name, any name, and simply click Create. It will take around five minutes to create. To recap: go to "Create a resource", go to Storage, and select any of the options, but I'm using the storage account here. You can also go for a quick tutorial. Click on it, give your storage a name, choose how it works, and click Create. Once it is created, you can see it in the dashboard, or you can go to All services and click on Recent. Okay? When you go there, you will find something like this: I have given my storage account a name, it is of type storage account, and I accessed it three minutes ago. From here I can also pin it to the dashboard, so next time I simply need to go to the dashboard, and I will find my storage account there. It will take a moment to appear, I think; I'm just refreshing it.
So once it is ready, you need to go to storageexplorer.com; simply type storageexplorer.com into your browser. After creating the storage account, enter that site and download Azure Storage Explorer. Once you download Storage Explorer, it will ask for your username and password. I simply enter mine and say Apply. Now, under your account, you can create file storage, tables, queues, or multiple other types of storage; databases are there too if you want them. I created a simple file share. Similarly, you can upload a file or a folder from your machine here. Okay? So that is one thing you can do, but we are really here to learn about roles. Meanwhile, while that loads, let me explain: IAM and access control. IAM typically looks to utilise a minimum of two, preferably three or more, factors of authentication. So two to three factors of authentication are involved. The key elements in IAM are provisioning and deprovisioning, centralised directory services, privileged user management, authentication, and access management. So, let's look at it. If you haven't found the account so far, go again to All services, then to Recent, and let's pin it to the dashboard. Okay, let's go to the dashboard and find it. Maybe it's taking a long time, so better we go to Recent. When you open the account in Storage Explorer, it will give you options to open, move, delete, and refresh, so you can also delete your share from here. To open it, again: go to storageexplorer.com and download the software.
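The transcript notes that IAM prefers two or three authentication factors. A common second factor is a one-time code; below is a minimal, self-contained sketch of HOTP (RFC 4226, the algorithm underlying the TOTP codes authenticator apps generate). The secret value is the RFC's own test key, used purely for illustration.

```python
import hmac
import hashlib
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password -- the basis of the
    TOTP codes commonly used as a second authentication factor."""
    msg = struct.pack(">Q", counter)            # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                  # RFC 4226 dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226 Appendix D test vector: counter 0 -> "755224"
print(hotp(b"12345678901234567890", 0))
```

TOTP simply replaces the counter with the current Unix time divided into 30-second steps, so the same function covers both.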
So on the left-hand side, you will find the following controls. The activity log shows when the account has been accessed, which is a good security feature: I have accessed it here, the status is Succeeded, the timestamp is there, my subscription is there, and the email ID that initiated the action is there. When you go to Access control (IAM), you will find role assignments. Okay? If you go to Roles, you will find there are plenty of roles, some of which you will recognise: Owner, who owns the data; Contributor; Reader; Backup Contributor; Backup Operator; DevTest Labs User; Log Analytics Contributor; and so on. This is critical when you have a storage facility with hundreds of employees: they will not all require the same set of rights. Because the data or storage is on the cloud, you should give each user a properly scoped role and assignment. So go to the role assignments and click Add. When you click Add, it will ask what kind of role you want to give. If you click the dropdown, you will get all the roles; for example, Reader. Then assign access to a user. This cloud uses Azure Active Directory, so if you are familiar with Windows Server 2003, 2008, 2012, or 2016, you will understand Active Directory. Okay, I have created a user in Active Directory, which I will show you later as well, but I'm selecting here. Choose a name or an email address: my email is here, so I select my own account. Done. So this is my member right now, and if I want to remove it, I can discard it, or I can save it. Once saved, if this particular member uses my storage, he has only read permission. And here you can grant multiple permissions the same way: enable site recovery, grant user access, monitor administrator activity, and so on. So assigning a role is pretty simple. But keep in mind that you should know the cloud platform before applying security to it; I won't teach you all of that here, but let me help you along.
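The role-assignment flow above (pick a role, pick a user, save) can be sketched as a tiny role-based access control model. The role names below echo the Azure built-in roles mentioned in the walkthrough, but the permission strings and functions are illustrative stand-ins, not the real Azure resource-provider actions.

```python
# Minimal RBAC sketch mirroring the Azure role-assignment flow above.
# Role names echo Azure built-in roles; the permission strings are
# illustrative, not real Azure "actions".
ROLES = {
    "Owner":              {"read", "write", "delete", "assign_roles"},
    "Contributor":        {"read", "write", "delete"},
    "Reader":             {"read"},
    "Backup Contributor": {"read", "backup"},
}

assignments: dict[str, set[str]] = {}        # user -> set of role names

def assign_role(user: str, role: str) -> None:
    if role not in ROLES:
        raise ValueError(f"unknown role: {role}")
    assignments.setdefault(user, set()).add(role)

def is_allowed(user: str, permission: str) -> bool:
    # A user is allowed an action if ANY of their roles grants it.
    return any(permission in ROLES[r] for r in assignments.get(user, ()))

assign_role("udemytest@example.com", "Reader")
print(is_allowed("udemytest@example.com", "read"))    # True
print(is_allowed("udemytest@example.com", "delete"))  # False
```

The key idea is the same as in the portal: permissions hang off roles, never off individual users, so revoking a role revokes everything it granted.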
On the left-hand side, you find Azure Active Directory. Okay? In Azure Active Directory, on the right, you find Users, so you can create a user. At the top is your directory name (an address in the name@tenant.onmicrosoft.com style). You can go to Users and give the new user a name, say "Udemy Test". This will be the name for logging in, and the username becomes udemytest@ followed by your domain name; there's nothing else you have to change. Okay? Profile and the rest you can fill in if you want, but the important part here is the directory role: what kind of Active Directory role do you want to give him as a user, a global administrator or a limited administrator? I'm just giving Global and clicking Create. You have now successfully created a user. Having done that, I go back to the list of recent services, open my storage account's Access control (IAM) again, and assign a role: the role name is Backup Contributor and the user is Udemy Test, which you will now find in the list. So IAM is very important here, and I have successfully assigned the role. To summarise what a storage account gives you: the activity log (who has accessed it and when); the roles held by each particular user; and the Storage Explorer preview, which I have already shown you. Access keys are another item: those keys are used to gain access, and you can copy and paste them where needed. As the portal explains: use an access key to authenticate your applications when making requests to this Azure storage account; store your keys securely, for example using Azure Key Vault, and don't share them. We recommend regenerating your keys regularly.
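Why two access keys? So one can be regenerated while clients keep authenticating with the other, as the next paragraph explains. Here is a toy sketch of that rotation pattern using HMAC request signing; the request string and key handling are invented for illustration and are not the real Azure Storage signing scheme.

```python
import hmac
import hashlib
import secrets

# Sketch of the dual-key scheme: a request signed with EITHER key is
# accepted, so key2 can be regenerated while clients still hold key1
# (and vice versa) -- rotation with no downtime.
keys = {"key1": secrets.token_bytes(32), "key2": secrets.token_bytes(32)}

def sign(key: bytes, request: bytes) -> str:
    return hmac.new(key, request, hashlib.sha256).hexdigest()

def verify(request: bytes, signature: str) -> bool:
    return any(hmac.compare_digest(sign(k, request), signature)
               for k in keys.values())

req = b"GET /container/blob"
sig = sign(keys["key1"], req)          # client signs with key1
keys["key2"] = secrets.token_bytes(32) # rotate key2: key1 clients unaffected
print(verify(req, sig))                # True
```

Once all clients have switched to the fresh key2, you regenerate key1 the same way, completing the rotation.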
You are provided two keys so that you can maintain a connection using one while regenerating the other. So there are two keys, and this protects data at rest. Okay, another item here is Encryption. Storage service encryption protects your data at rest: Azure encrypts your data as it is written in its data centres and automatically decrypts it for you as you access it. By default, data is encrypted using Microsoft-managed keys for Azure Blobs, Tables, Files, and Queues, but you may choose to bring your own key for encryption. So you can go here and enter your own key for an extra layer of control over the encryption, which I recommend as the best method, and then click Save. Okay. You can also add locks here: a lock name and a lock type of read-only or delete. And there is one more excellent feature which I'd like to explain: soft delete. Soft delete is similar to keeping a snapshot: if you delete data by mistake, it can be recovered easily. When turned on, soft delete enables you to save and recover your blob data in many cases where blobs or blob snapshots are deleted, and the protection extends to blobs that are deleted as the result of overwrites. Apart from that, there is monitoring, there are alerts when somebody accesses the account, and there are usage metrics. So far there are no alerts; you can manage them here. One more thing is usage: if you look for it here, you will find it, but it takes time. My network may be a bit slow and it's giving me errors retrieving data, but if you refresh it, it works, and as a result you gain insight into access to your data. Okay, so I'm again going to All services, then Recent, and here is my storage account. There are also firewalls and other virtual network protections, which I am not going to discuss at this moment in this particular slide. Now I'm going to click Delete, because I want to remove this account.
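The soft-delete behaviour described above can be captured in a few lines: "deleting" moves the blob to a recoverable holding area instead of destroying it. This is a toy in-memory model, not the Azure API; names like `BlobStore` and the retention mechanics are invented for illustration (real soft delete also has a retention window after which data is purged).

```python
# Toy soft-delete store: "deleting" a blob keeps it recoverable,
# matching the soft-delete behaviour described above.
class BlobStore:
    def __init__(self) -> None:
        self.blobs: dict[str, bytes] = {}
        self.deleted: dict[str, bytes] = {}   # recoverable until purged

    def put(self, name: str, data: bytes) -> None:
        self.blobs[name] = data

    def delete(self, name: str) -> None:
        # Soft delete: move aside rather than destroy.
        self.deleted[name] = self.blobs.pop(name)

    def undelete(self, name: str) -> None:
        self.blobs[name] = self.deleted.pop(name)

store = BlobStore()
store.put("report.docx", b"quarterly numbers")
store.delete("report.docx")          # gone from the normal listing...
store.undelete("report.docx")        # ...but recoverable
print("report.docx" in store.blobs)  # True
```

A hard delete would simply drop the `deleted` copy as well, which is exactly what the retention-window purge does in the real service.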
This action cannot be undone; it will permanently delete the storage account. Are you ready for it? You have to type the account name here, so I'm typing the name of my storage account, and only when the green tick shows that validation succeeded can you delete it. The storage account is now being deleted. Thank you very much, guys. We will continue in the upcoming slides.

18. Vendor Lock-in

Hello guys, good evening, and let's continue this session: data and media sanitisation, and vendor lock-in. Lock-in describes the situation where a customer may be unable to leave, migrate, or transfer to an alternative provider due to a technical or non-technical constraint. So suppose your data is with a provider and lock-in occurs: you are trying to access it, but the cloud provider may fail to pay the bills for their infrastructure or their internet connectivity (not your bills, theirs), or fail to pay their engineers, and so become unable to keep the networking and infrastructure up to date. Because the provider has a technical issue, a non-technical issue, or a financial issue, you may not be able to leave, migrate your services to some other cloud, or transfer to an alternative provider. The related sanitisation controls here are cryptographic erasure and data overwriting. Thank you.

19. Virtualization Security

So this is the last slide of domain one, and we have been talking about cloud security. So keep an eye out for virtualization security. Virtualization technologies enable cloud computing to become a real and scalable service offering due to the saving, sharing, and allocation of resources across multiple tenants and environments. Look at virtualization like this: there is one physical server, okay? And there are multiple users accessing this server through a provider, and the provider may keep all the servers in one place. For example, here is company ABC and here is company XYZ: ABC is accessing its virtual server from this hardware, and XYZ is accessing its services from the same hardware. Making that possible is virtualization. So again: virtualization technologies enable cloud computing to become a real and scalable service offering, due to the savings of one server giving services to multiple companies, organisations, or people, and the sharing and allocation of resources across multiple tenants and environments. The next term here is hypervisor. Without a doubt, the hypervisor is what provides virtualization. The role of the hypervisor is a simple one: first, to allow multiple operating systems to share a single hardware host; and second, to intelligently control the host processor and resources, prioritising the allocation of what is needed to each OS while ensuring there are no crashes and the neighbours do not upset each other. Those are roles one and two of the hypervisor. Let me explain it here so it is clear what a hypervisor does. Take the example of a server, a physical blade server, and inside this server there are VMs: virtual machine one, virtual machine two, virtual machine three, virtual machine four. So how? Take this one physical server as the example.
So on top of the physical server we install the hypervisor. As I have explained, there are two types of hypervisor, type 1 and type 2; I covered them in my last slides as well. We install a hypervisor, and the hypervisor is responsible for creating virtual machines (VMs) and allotting each one specific RAM, hard disk space, and CPU power. You can also control the load so that one VM does not affect its neighbours: suppose a VM is using 100% of its CPU allocation; it can use only its own CPU share without affecting the other machines, okay? You can set limits, allow unlimited CPU, and so on. The component specifically responsible for this is the hypervisor, whether type 1 or type 2. So that's what we are talking about here. Now, hypervisor security, for both types. We have already discussed the hypervisors themselves, but as for security: a type 1 hypervisor significantly reduces the attack surface compared with type 2. As I already explained, in type 1 there is the server, on top of the server there is the hypervisor, and on top of the hypervisor there are the guest operating systems, multiple OSes. But in type 2 there is the server, on top of the server there is a host OS, for example Windows 8 or Windows Server 2016; on top of that there is the hypervisor, and then the guest OSes. In that case, if there's an attack on the host machine, the attacker can definitely reach everything: those guest OSes are just files sitting on that host. So type 1 is far better than type 2. With type 1, the hypervisor vendor is also in charge of the relevant software that is included in and comes with the hypervisor package, such as the virtualization functions, operating-system functions like device drivers, and the input/output stack. Watch out for type 2 hypervisors: because they are OS-based, they are more attractive to hackers, given that there are far more vulnerabilities associated with the OS, as well as with the other applications that reside within the OS layer.
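The "put the limits so one VM cannot starve its neighbours" behaviour described above can be sketched as a simple admission check: every VM gets a CPU cap, and the hypervisor refuses to create a VM that would oversubscribe the host. This is a conceptual toy, not how any real hypervisor schedules CPU; the class and numbers are invented for illustration.

```python
# Toy admission/allocation check for the hypervisor behaviour above:
# each VM gets a CPU cap, and the host refuses allocations that would
# let one tenant starve the others (the "noisy neighbour" problem).
class Hypervisor:
    def __init__(self, host_capacity: int = 100) -> None:
        self.capacity = host_capacity        # total host CPU, in percent
        self.vms: dict[str, int] = {}        # vm name -> CPU cap (%)

    def create_vm(self, name: str, cpu_cap: int) -> bool:
        if sum(self.vms.values()) + cpu_cap > self.capacity:
            return False                     # would oversubscribe the host
        self.vms[name] = cpu_cap
        return True

hv = Hypervisor()
print(hv.create_vm("vm1", 50))   # True
print(hv.create_vm("vm2", 40))   # True
print(hv.create_vm("vm3", 30))   # False: only 10% of host CPU remains
```

Real hypervisors enforce this dynamically with shares, reservations, and limits rather than a static check, but the isolation goal is the same.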
So, you know, if somebody is very good at penetration testing or hacking, he can definitely find the weak points in Windows 8 and get straight in. Once he gets control of the base machine, it's very easy for him to access the hypervisor and then each operating system. So security in a type 1 hypervisor is unquestionably superior to security in a type 2. What are the common threats? I'm just going to read through them; if you have not done ethical hacking or some sort of cybersecurity course, just browse them one by one. The common threats are: data breaches; data loss; account or service traffic hijacking; insecure interfaces and APIs; denial of service; malicious insiders (remember people, process, technology: insiders are the bigger threat, because they have good knowledge and can attack easily); abuse of cloud services; insufficient due diligence; and shared technology vulnerabilities. Then there are the security considerations for the different cloud service models. The first is Infrastructure as a Service. What are the security considerations here? VM attacks, virtual network attacks, hypervisor attacks, VM-based rootkits, virtual switch attacks, DoS attacks, colocation, and multi-tenancy. Platform as a Service security means system and resource isolation, user-level permissions (we have already seen how I created a user and gave him permissions), and user access management, which was explained earlier. Software as a Service security covers data segregation, data access and policy, and web application security. Maybe in the upcoming slides we will go through the practical. Thank you very much.

Domain 2 (Cloud Data Security)

1. Data Discovery Method

Hey guys, good morning. My name is Mukesh, and welcome to the CCSP, the cloud security course, domain two. Domain two is about cloud data security. What we are going to talk about in this domain is the type of controls needed to administer various levels of availability, integrity, and confidentiality to secure data in the cloud. So there are definitely three things; of course, we also call this the CIA triad. Okay: C stands for confidentiality, I stands for integrity of data, and A is the availability of your data. These are the three major factors for your data: keep it confidential, preserve its integrity, and keep it always available. So this is cloud data security, domain two. The key roles for the data here are the data owner and the data custodian; there are basically two roles. Look at it this way: the cloud service provider acts as the data custodian, and the data owner is the owner of the data. It's important that this works two ways: the data owner has his own boundaries he must enforce to keep the data secure, and the data custodian likewise has his own, which we will discuss in upcoming videos. Each side tries to secure the data from its own platform as well as in the cloud; we will discuss how that takes place. But the security of a data set depends on both of them: the provider, and the owner and custodian. And note that data security is not only the responsibility of the data owner but of the data custodian too. Custodians are the ones working actively in the organisation: network engineers and administrators, enterprise architects, CTOs (Chief Technical Officers), CSOs, and CISOs. The top data discovery methods for discovering your data are label-based discovery, metadata-based discovery, content-based discovery, data mining, real-time analytics, and agile business intelligence. Label-based discovery is applied while creating the data.
The first method, then, is label-based discovery, applied when creating a data set. Okay: say you're making a data DVD, for example, and you mark it as confidential, as big data, and as HR-department data. Creating the data, giving it a label, and later finding the data by that label is label-based discovery. Let me explain it more clearly. The label is created by the data owner in the create phase of the data lifecycle. If, whenever you create a data set in that create phase, you attach an accurate and sufficient label, the organisation can readily determine what data it controls and what amount of each kind there is. This is another reason why the habitual process of labelling is so important: make labelling a habit every time you create data. For example, label data as HR, dated September 2018, or by department such as Finance; or perhaps build a root data structure with directories under it, like directory ABC and directory XYZ, split into 2017 and 2018, with 2018 split into January, February, and March, or into audio and video, and so on. Data should be labelled while we are creating it, or in the process of creating it, so that it remains easily accessible. Labels can be especially useful when a discovery effort is undertaken in response to a mandate with a specific purpose, such as a court order or a regulatory demand. If all data related to X is required and all such data is readily labelled, it is easy to collect and disclose all the appropriate data, and only the appropriate data. Labels can also be per user: data for user C, data for user D, data for user F, and so on. The second method is metadata-based discovery. Metadata, let me explain, matters when you have big data, say a Hadoop cluster or any huge database with trillions of records inside it. For example, think of a very big book of about 2,000 pages.
And if you want to go to a specific topic, do we start looking through all 2,000 pages? No: the book's first or second page is an index. We look in the index: chapter one, chapter two, chapter three, chapter four; under chapter four, topics A, B, and C; topic C, page 291. With that, we go there directly, okay? So metadata-based discovery is nothing but having that sort of index, so we can access data easily; in addition to the label, metadata is useful for discovery purposes. Colloquially referred to as "data about data," metadata is a listing of traits and characteristics about specific elements of a data set. Metadata is frequently generated automatically at the same time as the data, often by hardware or software. Let's have an example here. For instance, most modern digital cameras create a good amount of metadata every time a photograph is taken, such as the date, time, and location where the photo was taken, the model of the camera, and so on. Let me give you one more example: in a big photo library, you search by location, okay? Say, New York: whatever photos were taken in New York will appear. If you search for photos taken with an iPhone X, they will appear; likewise photos from 2018.
So searching on location, on make, on model, or on a particular type all falls under metadata, and discovery can therefore use metadata in the same way a label might be used: just as with a label, a specific field of metadata may be scanned for a specific term, and so on. Next is content-based discovery. Content-based discovery is possible even without labels or metadata: the discovery tool can locate and identify specific types of data by looking into the content of the data sets themselves. This method can be as simple as term searches or as complex as pattern-matching technology. If you are searching for a specific name, you can require an exact, 100% match; or when you search for C-A-T, you might ask for everything starting with C, so all of those results appear. Searching based on content in this way is content-based discovery. The next topic here, guys, is data analytics. Okay, we have been learning here that current technological options provide additional ways of finding and typing data. In many cases, these modern tools create new data feeds from sets of data already existing within the environment. These include data mining, the term for the family of activities that the other options on the list derive from. This kind of data analysis is an outgrowth of the possibilities offered by regular use of the cloud, also known as "big data": when the organisation has collected various data streams and can run queries across these various feeds, it can detect and analyse previously unknown trends and patterns that can be extremely useful. As a result, big data tools can be very useful for data mining. Real-time analytics, in some cases, can provide data-mining functionality concurrently with data creation and use. These tools rely on automation and require efficiency to perform properly, enabling agile business intelligence.
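The three discovery methods above can be contrasted on a toy record set: label-based looks at an explicit tag, metadata-based filters on indexed attributes (like the camera-location example), and content-based pattern-matches inside the data itself. The records, labels, and regex below are invented purely for illustration.

```python
import re

# Toy data set: each record carries a label, metadata, and raw content.
records = [
    {"label": "HR-2018",      "meta": {"location": "New York", "year": 2018},
     "content": "salary review for employee 4471"},
    {"label": "Finance-2017", "meta": {"location": "London",   "year": 2017},
     "content": "invoice INV-2017-009 card 4111-1111-1111-1111"},
]

def by_label(label):
    """Label-based discovery: match the explicit tag given at creation."""
    return [r for r in records if r["label"] == label]

def by_metadata(key, value):
    """Metadata-based discovery: filter on indexed 'data about data'."""
    return [r for r in records if r["meta"].get(key) == value]

def by_content(pattern):
    """Content-based discovery: pattern-match inside the data itself."""
    return [r for r in records if re.search(pattern, r["content"])]

print(len(by_label("HR-2018")))                      # 1
print(len(by_metadata("location", "New York")))      # 1
print(len(by_content(r"\d{4}-\d{4}-\d{4}-\d{4}")))   # 1 (card-number pattern)
```

Note how content-based discovery finds the card number even though nothing in the label or metadata hints at payment data, which is exactly why DLP tools lean on it.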
State-of-the-art data mining involves recursive, interactive tools and processes that can detect trends within trends and identify even more oblique patterns in historical and recent data. If you want to go deeper, just read up on big data, Hadoop, and data mining; there are various platforms and tools available. Okay, next is the cloud data lifecycle, which we will discuss in the upcoming slide. Thank you.

2. Cloud Data Life Cycle

Hey guys, let's talk about the stages of the data lifecycle in this domain. Okay. First of all, the stages are: create the data, store it, use it, share it with other people on the platform, then archive your data, and finally destroy it. So those are the stages of your data lifecycle: create, store, use, share, archive, and destroy. Let's learn them in detail. You can create data on the cloud or remotely. You can create data directly on the cloud service provider's platform, upload data you created elsewhere, or work within a database whose data is already there. Using means you are accessing the data and manipulating or modifying it: editing it, making changes, or checking records. Okay. Storing can be on the cloud or locally as well; storing your data in short-term storage makes it available for collaboration, for example sharing it with user ABC and user XYZ, or with an email ID such as abc@abc.com, and so on. Long-term data storage is archiving, and data should be destroyed at the end of the lifecycle; we'll talk about how you destroy data in the cloud or on your own drives. So let's start with the create phase. Data will most often be created by users accessing the cloud remotely. How? We will go through it practically as well in upcoming videos. Depending on the use case, the data might be created locally by the user at their remote workstation and then uploaded to the cloud data centre, or created within the cloud via remote data manipulation. In other words: you are working on your workstation, and your data will live over in the cloud. You want to use data you have created locally on your workstation; you are connected to your ISP (internet service provider), and the ISP connects you to the cloud service provider.
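The six stages listed at the start of this lecture can be sketched as a simple state machine. The transition table below is one reasonable reading of the standard create → store → use → share → archive → destroy flow (with use and share able to alternate, and archived data retrievable back into use); it is an illustrative model, not an official (ISC)² artefact.

```python
# The six lifecycle stages as a toy state machine. ALLOWED encodes the
# forward flow Create -> Store -> Use -> Share -> Archive -> Destroy,
# letting Use/Share alternate and Archive return data to Use.
ALLOWED = {
    "create":  {"store"},
    "store":   {"use", "archive"},
    "use":     {"share", "archive"},
    "share":   {"use", "archive"},
    "archive": {"use", "destroy"},
    "destroy": set(),                 # terminal: nothing follows destroy
}

def next_stage(current: str, target: str) -> str:
    if target not in ALLOWED[current]:
        raise ValueError(f"cannot move from {current} to {target}")
    return target

stage = "create"
for step in ("store", "use", "share", "use", "archive", "destroy"):
    stage = next_stage(stage, step)
print(stage)  # destroy
```

Modelling it this way makes the exam point concrete: every stage has its own security controls, and "destroy" is terminal, which is why sanitisation has to be done right.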
Later on, using some sort of application or remote tool, you upload your data to the cloud. We already talked about data at rest and data in motion, so the question is: how do I upload it safely and securely? Using some sort of VPN or network tunnel; we will get to that. In the create phase, we basically create the data on our local workstation and then, with the help of some application or remote plugin, we upload it to the cloud. Data created remotely by the user should be encrypted before uploading to the cloud. This protects against a man-in-the-middle attack, and against a malicious user inside the cloud. So suppose this is your workstation: you are connected to the Internet, the Internet connects to the cloud, and that is your central data storage. When you upload the data, make sure it is encrypted. Attackers on the Internet path will try to gain access; if they capture the traffic, at least they should not be able to decrypt it. That's the benefit of encryption. Likewise, if someone inside the cloud tries to access your data, they may reach the files, but if the data is encrypted, they may not be able to read it. So it is very important that data created remotely by users is encrypted before uploading to the cloud; it protects against MITM attacks and against users inside the cloud. The cryptosystem used for this purpose should have a high work factor and be a validated standard (for example, FIPS-listed); in other words, use a standard cryptosystem that cannot easily be broken. Apart from that, we should also implement good key management practices, so the data cannot be decrypted by a third party. Key management systems will be discussed later.
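The "encrypt before upload" idea can be sketched in a few lines. This is a toy illustration using a SHA-256 counter-mode keystream from the standard library so it runs anywhere; a real deployment would use a vetted AEAD cipher such as AES-GCM, and the function names here are my own:

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy keystream: SHA-256 in counter mode. Illustrative only --
    # production code should use a vetted cipher (e.g. AES-GCM).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce + ct  # prepend nonce so the owner can decrypt later

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))

key = secrets.token_bytes(32)          # stays on the workstation / in a KMS
blob = encrypt(key, b"customer records")
# Only `blob` (ciphertext) leaves the workstation; a man-in-the-middle
# or an insider at the provider sees random-looking bytes.
assert decrypt(key, blob) == b"customer records"
```

The key point is that the key never travels with the data: it stays with the data owner or in a key management service.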
The connection used to upload the data should also be secure, preferably with an IPsec VPN solution. Okay, so what is that? Here we are: this is the user, connected to an Internet service provider; there may be multiple ISPs in between, and then somewhere there is the cloud service provider, where our data and our storage live. The user creates some data, say a file ABC, and uses an application to upload it to the cloud in the create phase. Now, if your data travels as plain text, an attacker can easily capture it, because the Internet is a public network with millions of connected users, and data captured by a hacker is not good for either the data owner or the data custodian. So what we do is create a secure tunnel, a VPN or IPsec tunnel, from source to destination. Your data ABC travels through this tunnel, and the data itself is also encrypted, so there is the VPN tunnel plus encryption. This is not a physical tunnel but a logical one; attackers may not be able to get inside the tunnel, and even if they do, your data is also encrypted, so there is double security. That's what we are saying here: the connection used to upload the data should also be secured, preferably with an IPsec VPN solution. Note: not all data needs the same level of attention, so first we need to label the data and then make the security decisions based on those labels.
The second phase is store. The store phase is closely related to the create phase, so it requires similar kinds of attention, especially for short-term data storage: the data is created, uploaded through the same VPN or IPsec tunnel, and then stored in the cloud. The third phase is data in use. Data is in use when you need it: when you check a record, manipulate it, alter it, delete it, and so on. So after data storage, we need this phase for alteration or manipulation of the data. Stored data requires proper controls before being made usable, for example an ACL. So what is an ACL? Let's talk about that too. Picture a server in the cloud, with users connected through their ISPs from their laptops or cell phones. The ACL, the access control list, determines who can access the data and with what rights. For example, user one cannot access the data at all, user two can only read it, and user three can read and write it. So access control is important: your data already has a key management system, so only people with the right credentials can reach it, but on top of that you can use access control to decide who can and who cannot access it. You can add only a few users and give each of them specific permissions. The platform through which the data is accessed also needs security, for example workstations and BYOD devices. Look at this platform: this user is accessing the data.
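The ACL example in the narration (user one has no access, user two is read-only, user three can read and write) can be sketched as a simple permission lookup. This is a minimal illustration; the user names and the `is_allowed` helper are hypothetical:

```python
# Minimal sketch of an access control list: each user maps to a set of
# permitted actions. A user absent from the ACL has no access at all.
acl = {
    "user2": {"read"},
    "user3": {"read", "write"},
    # "user1" is deliberately absent: no access
}

def is_allowed(user: str, action: str) -> bool:
    """Return True if the ACL grants `user` the given `action`."""
    return action in acl.get(user, set())

assert not is_allowed("user1", "read")   # user1 cannot access the data
assert is_allowed("user2", "read")       # user2 is read-only
assert not is_allowed("user2", "write")
assert is_allowed("user3", "write")      # user3 can read and write
```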
So this device also needs security, because if the device is not secure, an attacker who compromises it can access the data as well. So, security for the device. BYOD means "bring your own device": many companies allow people to bring their own cell phones and laptops so they can work on their personal machines as well. These devices also need security: password protection, antivirus, and up-to-date patches. That needs to be ensured, and users need proper training before the data is made available: tell users not to save data locally, not to write down their passwords, and to connect through the VPN first and then access the data. So classification and access control are the basics, and then DLP, VPN, DRM, and the other controls we can get from the provider side should be in place when data is in use. Global collaboration is a powerful capability afforded by the cloud, but it comes with risk: if the user can be anywhere on the planet, so can the threat. That's what this is about. For example, your data sits with a cloud service provider in the eastern United States, and you can access it via the Internet from Europe, Asia, Australia, the Middle East, and many other locations. The advantage is that you are in Europe and you can access it; you travel to Asia and you can still work. The problem is that threats can come at you from all of those locations too. So the plus point is accessibility, but again, you have to be careful and you have to provide the security.
So many of the controls implemented in the prior phases will be useful here, such as encryption and DRM: encrypt your data first, and access it using a VPN or IPsec. We may also need to limit or prevent data from being sent to certain locations in accordance with regulatory mandates, for example the International Traffic in Arms Regulations (ITAR): U.S. State Department prohibitions on defense-related exports can include cryptography systems. It is also possible to make your data accessible from certain locations only. So that is data in use. The archive phase is the phase for long-term storage. Cryptography will, as with most data-related controls, be an essential consideration: there should be some sort of cryptographic protection, symmetric or asymmetric. The physical security of data in long-term storage is also important. For example, if you are storing data locally, physical security must be there: doors locked, hard drives kept in a safe. If it is in the cloud, the facility should be guarded 24/7, with CCTV, security guards, and so on. The logical security of the data in long-term storage is just as important. The following factors need to be considered before archiving the data. Number one, location: where is the data being stored? Make sure the place is assessed for natural disasters and climate. For example, if it is in Japan, which has been hit by tsunamis and other disasters, look at the building where the data centre is built: is it proofed against natural disasters? Is the climate not extremely cold or otherwise hostile? Do they have proper setups to protect against natural disasters? Second is format: is the data being stored on a physical medium like tape backup or magnetic storage? Is the medium highly portable and in need of additional security controls to prevent theft?
So is it a hard drive, DVDs; what kind of format is your data in? Third is staff: are the personnel at the storage location employed by our organization? If not, does the contractor implement personnel controls sufficient for our purposes: background checks, reliability checks, monitoring, and so on? If you keep your data somewhere in the cloud, the cloud service provider or a third party protects it, but who makes sure they have proper policies, and background checks on the guards and the people doing the monitoring? Fourth is procedure: how is data recovered when needed? Full or incremental backup, et cetera. So if you need a restore, what are the procedures? Do you need a specific file from a specific date, or an entire tape backup? So that is the archive phase. The last one is the destroy phase. Cryptographic erasure, also known as crypto-shredding, is the only data destruction method currently practical in the cloud environment. When your data has served its purpose and you want to erase it, crypto-shredding destroys the encryption keys so the data can never be recovered. And that's the cloud data lifecycle. Thank you.
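Crypto-shredding works because encrypted data is unreadable without its key: destroy every copy of the key and the ciphertext is effectively destroyed, no matter how many replicas the provider keeps. A minimal sketch, where the in-memory dictionary stands in for a real key management service and all names are illustrative:

```python
import secrets

# Hypothetical in-memory key store standing in for a KMS.
key_store = {"records-key": secrets.token_bytes(32)}

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Toy cipher: XOR with a random key at least as long as the data
    # (a one-time pad). Real systems would use AES or similar.
    return bytes(a ^ b for a, b in zip(data, key))

ciphertext = xor_cipher(key_store["records-key"], b"sensitive data")

def crypto_shred(key_id: str) -> None:
    # Destroying the key renders every replica of the ciphertext
    # unrecoverable, wherever the provider may have copied it.
    del key_store[key_id]

crypto_shred("records-key")
assert "records-key" not in key_store  # data is now effectively destroyed
```

Note that with the key gone, the ciphertext itself can be left behind safely: without the key, it is indistinguishable from random bytes.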

ExamCollection provides complete prep materials in VCE file format, including ISC-CCSP certification exam dumps, practice test questions and answers, a video training course, and a study guide, to help exam candidates pass their exams quickly. ISC-CCSP exam dumps and practice test questions with accurate answers are verified by industry experts, taken from the latest pool of questions, and updated fast.
