AWS Certified Cloud Practitioner CLF-C02 Exam Dumps & Practice Test Questions

Question 1:

A business is preparing to move a large volume of files into AWS using an Amazon Snowball Edge device. 

Which of the following activities does not incur a cost as part of this data transfer process?

A. Using the Snowball Edge device for up to 10 days
B. Downloading data from Amazon S3 to the Snowball Edge
C. Uploading data from the Snowball Edge to Amazon S3
D. Continuing to use the Snowball Edge after the 10-day limit

Correct Answer: A

Explanation:

Amazon Snowball Edge is part of AWS’s Snow Family—a set of physical devices used to transfer large data sets to and from AWS securely and efficiently. It helps organizations bypass potentially slow and costly internet transfers by allowing them to load data locally and ship it directly to AWS. Although Snowball Edge offers several capabilities, it comes with defined pricing rules that organizations should be aware of.

When a Snowball Edge device is ordered, the first 10 days of usage from the date of shipment arrival are provided at no additional cost. This means customers can take their time loading data onto the device without worrying about daily charges—provided it’s done within the 10-day period. Therefore, Option A is correct.

Beyond this 10-day window, daily charges apply, making Option D a cost-incurring activity.

Transferring data from Amazon S3 to the Snowball Edge (Option B) involves outbound data movement, which generally incurs transfer fees. Similarly, uploading data into Amazon S3 from Snowball Edge (Option C) may include charges depending on the service usage tier, such as S3 PUT requests or storage costs.

Thus, while the Snowball Edge service helps save bandwidth and time, only the first 10 days of physical device usage are cost-free. Any additional data transfer activities or extended use of the device are subject to AWS’s billing policies.
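The pricing rule above can be sketched in a few lines of Python. The daily rate used here is a hypothetical placeholder for illustration, not an actual AWS price; consult the AWS Snowball pricing page for current figures.

```python
# Sketch of Snowball Edge device-usage billing: the first 10 on-site
# days are included, and each day beyond that incurs a daily charge.
# DAILY_RATE_AFTER_FREE is a made-up placeholder, not a real AWS price.

FREE_DAYS = 10                 # on-site days included with each job
DAILY_RATE_AFTER_FREE = 30.0   # hypothetical per-day charge (USD)

def device_usage_cost(days_on_site: int) -> float:
    """Return the device-usage portion of the bill."""
    extra_days = max(0, days_on_site - FREE_DAYS)
    return extra_days * DAILY_RATE_AFTER_FREE

print(device_usage_cost(7))    # within the free window -> 0.0
print(device_usage_cost(14))   # 4 days past the window -> 120.0
```

This mirrors the exam's point: staying within the included window (Option A) costs nothing extra, while keeping the device longer (Option D) accrues daily charges.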

Choosing Option A correctly identifies the only no-cost activity related to using a Snowball Edge device.

Question 2:

A company has deployed several applications on Amazon EC2 and wants to detect potential vulnerabilities in those applications while ensuring that its infrastructure aligns with AWS security best practices. 

Which AWS service should the company use?

A. AWS Trusted Advisor
B. Amazon Inspector
C. AWS Config
D. Amazon GuardDuty

Correct Answer: B

Explanation:

When securing applications running on Amazon EC2 instances, it’s essential to regularly scan for vulnerabilities and ensure compliance with best practices. Among AWS’s suite of security tools, Amazon Inspector is specifically designed for automated vulnerability management and assessment of workloads on AWS.

Amazon Inspector continuously and automatically scans EC2 instances and container images in Amazon Elastic Container Registry (ECR) to identify known vulnerabilities (CVEs) and deviations from security best practices. It also monitors for needed software updates and security patches and prioritizes findings by severity. This makes it highly effective for companies looking to detect weaknesses within their application infrastructure.

Option A: AWS Trusted Advisor, while valuable, is a broader tool that provides high-level recommendations across categories like cost optimization and performance. Although it offers some security checks, it doesn’t offer deep application-level vulnerability scanning.

Option C: AWS Config tracks configuration changes and evaluates compliance with predefined rules, but it does not identify vulnerabilities in EC2 instances or applications. It's best used for governance and auditing.

Option D: Amazon GuardDuty specializes in threat detection by analyzing AWS CloudTrail event logs, VPC Flow Logs, and DNS query logs to uncover suspicious behavior or potential account compromise. However, it does not scan applications for vulnerabilities.

In summary, Amazon Inspector is purpose-built for detecting application vulnerabilities in EC2 instances. It gives detailed and prioritized insights to help businesses improve their security posture effectively. Therefore, Option B is the correct choice for assessing application vulnerabilities and ensuring best practice compliance in AWS environments.

Question 3:

A company has a group of users who frequently handle large files and have surpassed their on-premises storage capacity. They want to extend their storage using AWS while still benefiting from low-latency local access.

Which AWS solution provides the most operational efficiency in this case?

A. Set up individual Amazon S3 buckets for each user and mount them using an S3 mounting utility
B. Implement AWS Storage Gateway using a file gateway and connect user workstations to it
C. Migrate user environments to Amazon WorkSpaces and assign Amazon WorkDocs accounts
D. Use an Amazon EC2 instance with an attached Provisioned IOPS EBS volume and share it among users

Correct Answer: B

Explanation:

In this scenario, the company is seeking to augment its local file storage using the AWS Cloud, while maintaining the speed and convenience of accessing files locally. AWS Storage Gateway configured as a file gateway is the most efficient and scalable approach for achieving this hybrid storage setup.

The file gateway configuration allows files stored in Amazon S3 to be accessed via standard file protocols like SMB or NFS. It includes local caching, ensuring frequently accessed files are stored on-premises for low-latency access, while less-used files reside in S3 for cost-effective scalability.

Option A suggests mounting S3 buckets directly using third-party tools. While technically feasible, this lacks native performance optimization and caching, often resulting in slower access and complicated setup. Moreover, Amazon S3 is object storage, not a traditional file system.

Option C involves moving users to Amazon WorkSpaces and setting up WorkDocs, which would be overkill. This solution requires a complete shift in user workflow and computing environments, adding complexity and cost unnecessarily when only storage expansion is needed.

Option D proposes using an EC2 instance with a shared EBS volume, which is impractical for multi-user access. EBS volumes are not designed to be shared natively across users or workstations, and managing file sharing from a centralized EC2 host would lead to inefficiency and possible access bottlenecks.

In summary, AWS Storage Gateway file gateway is purpose-built for hybrid use cases like this. It extends on-premises storage to the cloud seamlessly, with operational efficiency, security, and performance in mind—making it the ideal solution for the company's needs.

Question 4:

What is the most secure way to allow an EC2 instance to upload data to an Amazon S3 bucket according to AWS security best practices?

A. Embed IAM credentials directly into the application code
B. Save the IAM keys in a text file on the EC2 instance and read them during runtime
C. Assign an IAM role to the EC2 instance to grant S3 upload permissions
D. Modify the S3 bucket policy to allow unrestricted access for any AWS service

Correct Answer: C

Explanation:

AWS emphasizes the principle of least privilege and avoiding hardcoded credentials as a critical security best practice. The most secure method to allow an EC2 instance to interact with an S3 bucket is to assign an IAM role to the instance. This allows it to obtain temporary credentials and permissions without ever storing or exposing sensitive information.

With Option C, the IAM role is attached to the EC2 instance and configured with a policy that grants permission to upload files to S3. AWS automatically provides the temporary credentials to the instance using the Instance Metadata Service. This approach eliminates the risks of leaked or misused credentials.
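As a sketch, the permission policy carried by such a role might look like the following. The bucket name is a hypothetical placeholder, and a real policy should be scoped to your own resources; the JSON structure itself follows the standard IAM policy grammar.

```python
import json

# Minimal least-privilege policy a role could carry so the EC2
# instance can upload objects to one bucket. "example-upload-bucket"
# is a placeholder name, not from the question.
upload_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:PutObject"],
            # The /* suffix limits the grant to objects in the bucket,
            # not the bucket configuration itself.
            "Resource": "arn:aws:s3:::example-upload-bucket/*",
        }
    ],
}

print(json.dumps(upload_policy, indent=2))
```

The policy is attached to the role, the role is attached to the instance via an instance profile, and no long-term keys ever touch the instance's disk or code.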

Option A involves embedding access keys directly into application code, which is highly discouraged. Such keys can be easily exposed through code repositories or logs, leading to potential security breaches.

Option B suggests storing credentials in plain text on the server, which is equally insecure. If the EC2 instance is compromised, an attacker can easily extract those credentials and gain unauthorized access to AWS services.

Option D proposes a bucket policy that allows any service unrestricted access. This is extremely risky and violates security best practices. Broad permissions increase the attack surface and could allow unintended access from malicious or misconfigured services.

By contrast, IAM roles provide temporary credentials, rotate automatically, and reduce the risk of long-term credential exposure. They also make it easier to manage access at scale, especially in environments where multiple instances may need controlled permissions to access AWS resources.

Therefore, using IAM roles is the most secure, scalable, and manageable way for EC2 instances to interact with S3, fully aligning with AWS’s security recommendations.

Question 5:

Under the AWS Shared Responsibility Model, which task falls under the customer’s responsibility when using Amazon DynamoDB?

A. Protecting the physical hardware running DynamoDB
B. Applying patches to the DynamoDB service
C. Managing access to the DynamoDB tables
D. Encrypting data stored in DynamoDB at rest

Correct Answer: C

Explanation:

The AWS Shared Responsibility Model clearly divides security and compliance tasks between AWS and its customers. While AWS is responsible for securing the underlying infrastructure and managed services like DynamoDB, the customer is responsible for access control and data governance within those services.

Option C is correct because it’s the customer’s duty to manage who can access their DynamoDB tables and what actions they can perform. This includes setting up appropriate IAM policies, roles, and resource-based permissions to ensure only authorized users or services can interact with the database. Misconfigured access policies could lead to unauthorized data exposure or modification, so this task is critical to maintaining application and data security.
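A sketch of what this customer-side access control looks like in practice follows; the table name, region, and account ID are placeholders. The point is that the customer, not AWS, decides which actions are allowed on which tables.

```python
import json

# Sketch of a customer-managed IAM policy scoping access to a single
# DynamoDB table. The ARN components here are illustrative placeholders.
table_arn = "arn:aws:dynamodb:us-east-1:123456789012:table/Orders"

read_write_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "dynamodb:GetItem",   # read a single item
                "dynamodb:PutItem",   # write a single item
                "dynamodb:Query",     # query by partition key
            ],
            "Resource": table_arn,    # only this table, nothing else
        }
    ],
}

print(json.dumps(read_write_policy, indent=2))
```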

Option A involves physical security, such as protecting data centers and server hardware. This is entirely AWS’s responsibility. Customers do not have access to the physical infrastructure.

Option B refers to patching and updating the database software. Since DynamoDB is a fully managed service, AWS handles all maintenance tasks, including security updates and performance optimizations.

Option D relates to encryption of data at rest. DynamoDB encrypts all data at rest by default using keys managed through AWS Key Management Service (KMS), so the encryption itself is performed by the service rather than the customer. Customers can opt to use their own KMS keys, and controlling access to those keys is a customer responsibility, but the act of encrypting stored data is not a task left to the customer in the way access management is.

To summarize, although AWS manages the infrastructure and security of the platform, customers are accountable for managing data access and usage within DynamoDB, making access control their responsibility in this shared model.

Question 6:

Which of the following is one of the foundational perspectives included in the AWS Cloud Adoption Framework (AWS CAF)?

A. Sustainability
B. Performance efficiency
C. Governance
D. Reliability

Correct Answer: C

Explanation:

The AWS Cloud Adoption Framework (AWS CAF) provides a structured approach for organizations planning to adopt cloud technologies, guiding them through essential aspects of cloud transformation. It consists of six well-defined perspectives: Business, People, Governance, Platform, Security, and Operations. Each of these perspectives encompasses specific capabilities that organizations must develop to effectively transition to the AWS Cloud.

Among these, the Governance perspective focuses on establishing policies, procedures, and controls that ensure an organization’s cloud operations align with internal business goals and external compliance requirements. It emphasizes risk management, cost control, compliance auditing, and decision-making processes, all of which are foundational to secure and efficient cloud adoption.

Let’s analyze the other options to understand why they are incorrect:

  • A. Sustainability: While AWS places growing importance on environmental sustainability and energy-efficient architectures, Sustainability is a pillar of the AWS Well-Architected Framework, not one of the six core CAF perspectives.

  • B. Performance Efficiency: This concept belongs to the AWS Well-Architected Framework, not the AWS CAF. It emphasizes using resources efficiently but does not represent a strategic planning category within the CAF.

  • D. Reliability: Like performance efficiency, Reliability is also part of the Well-Architected Framework. It deals with system uptime and failure recovery but is not considered one of the CAF’s foundational perspectives.

In summary, Governance is one of the six primary perspectives of the AWS CAF and is essential for enforcing standards and ensuring accountability during cloud adoption. It is the correct answer because it directly supports long-term success by integrating compliance, financial management, and operational oversight into the cloud strategy.

Question 7:

A business is currently running Docker containers on Amazon EC2 and wants to simplify cluster management, scheduling, and maintenance. Which AWS service offers a suitable alternative?

A. AWS Lambda
B. Amazon RDS
C. AWS Fargate
D. Amazon Athena

Correct Answer: C

Explanation:

Running containerized workloads on Amazon EC2 gives companies control but also requires them to manage the underlying infrastructure. Tasks like provisioning virtual machines, scaling clusters, scheduling containers, and performing maintenance can become time-consuming and complex. AWS offers Fargate as a fully managed alternative that addresses these challenges.

AWS Fargate is a serverless compute engine for containers. It works with Amazon ECS and Amazon EKS to allow users to run containers without managing servers or clusters. This means there is no need to worry about EC2 provisioning, patching, or scaling. With Fargate, each container gets the exact compute resources it needs, leading to better resource efficiency and cost savings.

Now, let’s consider the incorrect options:

  • A. AWS Lambda: This is a serverless platform, but it is optimized for short-lived functions triggered by events, not for running full containerized applications or managing Docker environments.

  • B. Amazon RDS: This service is used to manage relational databases. It is unrelated to container orchestration or Docker, making it unsuitable for the scenario.

  • D. Amazon Athena: Athena is a query service used for analyzing data in Amazon S3 using SQL. It does not provide container management or scheduling capabilities.

By switching to AWS Fargate, the company can offload the operational burden of managing EC2 instances and instead focus on developing and deploying their applications. This improves scalability, reduces administrative overhead, and provides an elastic, efficient container management solution.
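To make the "no servers to manage" point concrete, the key fields of a Fargate task definition can be sketched as the JSON structure you would register with ECS. The family name, image, and sizes below are placeholders for illustration, not values from the question.

```python
# Sketch of the core fields in a Fargate task definition. With the
# Fargate launch type, there are no EC2 instances to declare; you
# state the task's CPU/memory and container image, and AWS runs it.
task_definition = {
    "family": "web-app",                     # placeholder name
    "requiresCompatibilities": ["FARGATE"],  # serverless launch type
    "networkMode": "awsvpc",                 # required for Fargate tasks
    "cpu": "256",                            # 0.25 vCPU
    "memory": "512",                         # 512 MiB
    "containerDefinitions": [
        {
            "name": "web",
            "image": "nginx:latest",         # placeholder image
            "portMappings": [{"containerPort": 80}],
        }
    ],
}
```

Contrast this with the EC2 launch type, where the same task definition would also imply a cluster of instances that the company must provision, patch, and scale.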

Question 8:

A business is deploying a NoSQL database on Amazon EC2. In this case, which responsibility remains with AWS under the shared responsibility model?

A. Updating the guest operating system
B. Managing database high availability
C. Patching the physical infrastructure
D. Configuring the instance firewall rules

Correct Answer: C

Explanation:

When deploying workloads on Amazon EC2, the AWS Shared Responsibility Model applies. This model outlines the division of responsibility between AWS and the customer. AWS is responsible for securing the underlying cloud infrastructure, while customers are accountable for the configuration and management of everything they run on top of that infrastructure.

In this scenario, AWS is responsible for patching and maintaining the physical infrastructure, which includes the servers, storage, networking, and data centers that host EC2 instances. This ensures that hardware-level security, physical access, and foundational networking are handled by AWS, maintaining uptime and compliance with industry standards.

Let’s review the other options:

  • A. Updating the guest operating system: This is entirely the customer’s responsibility. Once an EC2 instance is launched, the user must apply OS patches, security updates, and software configurations.

  • B. Managing database high availability: When using EC2, customers are expected to architect their application for high availability. This includes setting up replication, clustering, and backup strategies for the NoSQL database.

  • D. Configuring the instance firewall rules: Customers are responsible for configuring security groups and network ACLs associated with their EC2 instances. This includes controlling which IPs and ports are accessible.

In conclusion, while AWS provides and maintains the foundational infrastructure, the customer is responsible for managing everything above the hypervisor. Therefore, patching the physical infrastructure that hosts EC2 instances is AWS’s responsibility, making C the correct answer. This separation of duties helps maintain security and operational clarity in cloud environments.

Question 9:

Which AWS tools are most effective for identifying whether your EC2 instances are appropriately sized for cost and performance efficiency? (Select two.)

A. AWS Cost Explorer
B. AWS Billing Conductor
C. Amazon CodeGuru
D. Amazon SageMaker
E. AWS Compute Optimizer

Correct Answer: A, E

Explanation:

To optimize EC2 instance costs and performance, it’s essential to evaluate how well the chosen instance types match the actual workload. AWS provides specific tools that help organizations assess whether they are over-provisioning or underutilizing resources—this is known as rightsizing.

AWS Cost Explorer is one such tool that helps you visualize and analyze your spending trends. It includes a feature for rightsizing recommendations that examines usage data and identifies EC2 instances that are either underused or oversized. Cost Explorer allows users to spot patterns in usage, making it easier to decide which instances could be replaced with smaller, more cost-effective options or terminated entirely if no longer needed.

AWS Compute Optimizer is another tool built specifically for rightsizing EC2 instances. It leverages machine learning to analyze historical usage metrics like CPU, memory, and disk I/O. Based on this analysis, it offers suggestions on the best instance type for your workload. These recommendations help you maintain optimal performance while reducing unnecessary spending.
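The core rightsizing idea behind both tools can be illustrated with a toy heuristic: flag instances whose average CPU utilization stays below a threshold. The threshold and sample data below are made up; the real services analyze far richer metrics (and, for Compute Optimizer, apply machine learning).

```python
# Toy illustration of rightsizing: mark instances whose average CPU
# utilization is persistently low as candidates for a smaller instance
# type. Threshold and metrics are fabricated for this sketch.
UNDERUTILIZED_CPU_PCT = 20.0

def flag_underutilized(instances):
    """instances: {instance_id: [cpu_utilization_samples]} -> low-use IDs."""
    return [
        instance_id
        for instance_id, cpu_samples in instances.items()
        if sum(cpu_samples) / len(cpu_samples) < UNDERUTILIZED_CPU_PCT
    ]

metrics = {
    "i-aaa": [5.0, 8.0, 6.0],     # mostly idle -> downsize candidate
    "i-bbb": [70.0, 85.0, 90.0],  # busy -> leave as is
}
print(flag_underutilized(metrics))  # ['i-aaa']
```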

The other options do not support EC2 rightsizing:

  • AWS Billing Conductor focuses on customizing billing and cost sharing among business units but doesn't evaluate resource usage or make optimization suggestions.

  • Amazon CodeGuru helps developers improve code quality and performance but does not evaluate infrastructure or instance sizing.

  • Amazon SageMaker is tailored for building and deploying machine learning models and is unrelated to EC2 instance management.

In summary, AWS Cost Explorer and AWS Compute Optimizer are the two services designed to help users discover EC2 rightsizing opportunities. They support better cloud cost governance and ensure you’re getting the best performance for your investment.

Question 10:

Which two advantages does AWS Trusted Advisor provide to help optimize an AWS environment? (Select two.)

A. Offers high-performance container orchestration
B. Manages encryption key creation and rotation
C. Identifies underused resources to reduce spending
D. Enhances security by offering proactive monitoring
E. Enforces tagging rules across all AWS assets

Correct Answer: C, D

Explanation:

AWS Trusted Advisor is a comprehensive tool that provides best-practice guidance across key categories including cost optimization, security, fault tolerance, performance, and service limits. It helps AWS users enhance their environment by highlighting potential areas for improvement.

One of its major benefits is cost optimization. Trusted Advisor analyzes your resource usage and identifies underutilized services like EC2 instances, EBS volumes, and RDS databases. When it finds resources that aren’t being fully utilized, it recommends actions like downsizing or termination. This guidance helps organizations avoid waste and only pay for what they actually use.

Trusted Advisor is also highly effective in proactive security monitoring. It runs automated checks to spot common security misconfigurations, such as overly permissive access policies, unsecured S3 buckets, or disabled MFA on root accounts. These recommendations help you close potential security gaps before they can be exploited.
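The flavor of these automated checks can be sketched with a toy function: given some account state, emit human-readable findings. The bucket names and flags here are fabricated for illustration; the real service inspects your actual AWS configuration.

```python
# Toy sketch of Trusted Advisor-style security checks: report publicly
# accessible buckets and a root account without MFA. All inputs are
# fabricated example data, not real AWS state.
def security_findings(buckets, root_mfa_enabled):
    """buckets: {bucket_name: is_public} -> list of finding strings."""
    findings = []
    for name, is_public in buckets.items():
        if is_public:
            findings.append(f"S3 bucket '{name}' allows public access")
    if not root_mfa_enabled:
        findings.append("MFA is not enabled on the root account")
    return findings

# Two findings: one public bucket plus the missing root MFA.
print(security_findings({"logs": False, "assets": True},
                        root_mfa_enabled=False))
```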

The remaining options are unrelated to Trusted Advisor’s core capabilities:

  • High-performance container orchestration is managed by services like Amazon ECS or EKS, not Trusted Advisor.

  • Encryption key creation and rotation are handled by AWS Key Management Service (KMS), not Trusted Advisor.

  • Tag enforcement is better managed by services like AWS Organizations, Service Control Policies (SCPs), or AWS Config, which track and enforce compliance policies.

In conclusion, AWS Trusted Advisor helps users by reducing costs through rightsizing recommendations and improving security through continuous checks. It’s an essential service for organizations seeking to optimize their AWS usage according to best practices.
