SAA-C03 Premium Bundle

AWS Certified Solutions Architect - Associate (SAA-C03) Certification Exam

Last update: December 4, 2024

Amazon-Web-Services SAA-C03 Free Practice Questions

100% guaranteed SAA-C03 free practice questions and free sample materials for Amazon-Web-Services certification candidates. Real success guaranteed with updated SAA-C03 PDF and VCE materials. Pass the AWS Certified Solutions Architect - Associate (SAA-C03) exam today!

Amazon-Web-Services SAA-C03 free dumps questions online. Read and test now.

NEW QUESTION 1
A company needs guaranteed Amazon EC2 capacity in three specific Availability Zones in a specific AWS Region for an upcoming event that will last 1 week.
What should the company do to guarantee the EC2 capacity?

  • A. Purchase Reserved Instances that specify the Region needed
  • B. Create an On-Demand Capacity Reservation that specifies the Region needed
  • C. Purchase Reserved Instances that specify the Region and three Availability Zones needed
  • D. Create an On-Demand Capacity Reservation that specifies the Region and three Availability Zones needed

Answer: D

Explanation:
https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ec2-capacity-reservations.html: "When you create a Capacity Reservation, you specify:
The Availability Zone in which to reserve the capacity"
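
For context, a minimal boto3 sketch of creating one such zonal Capacity Reservation. The Region, instance type, count, and dates are illustrative assumptions; one reservation would be created per required Availability Zone.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # assumed Region

# Reserve capacity in one specific Availability Zone for the 1-week event.
# Repeat this call for each of the three required zones.
response = ec2.create_capacity_reservation(
    InstanceType="m5.large",            # illustrative instance type
    InstancePlatform="Linux/UNIX",
    AvailabilityZone="us-east-1a",      # one of the three required zones
    InstanceCount=10,                   # illustrative capacity
    EndDate="2025-01-08T00:00:00Z",     # reservation ends after the event
    EndDateType="limited",
)
print(response["CapacityReservation"]["CapacityReservationId"])
```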

NEW QUESTION 2
A company runs an online marketplace web application on AWS. The application serves hundreds of thousands of users during peak hours. The company needs a scalable, near-real-time solution to share the details of millions of financial transactions with several other internal applications. Transactions also need to be processed to remove sensitive data before being stored in a document database for low-latency retrieval.
What should a solutions architect recommend to meet these requirements?

  • A. Store the transactions data in Amazon DynamoDB. Set up a rule in DynamoDB to remove sensitive data from every transaction upon write. Use DynamoDB Streams to share the transactions data with other applications.
  • B. Stream the transactions data into Amazon Kinesis Data Firehose to store data in Amazon DynamoDB and Amazon S3. Use AWS Lambda integration with Kinesis Data Firehose to remove sensitive data.
  • C. Other applications can consume the data stored in Amazon S3.
  • D. Stream the transactions data into Amazon Kinesis Data Streams. Use AWS Lambda integration to remove sensitive data from every transaction and then store the transactions data in Amazon DynamoDB. Other applications can consume the transactions data off the Kinesis data stream.
  • E. Store the batched transactions data in Amazon S3 as files.
  • F. Use AWS Lambda to process every file and remove sensitive data before updating the files in Amazon S3. The Lambda function then stores the data in Amazon DynamoDB. Other applications can consume transaction files stored in Amazon S3.

Answer: C

Explanation:
The destination of your Kinesis Data Firehose delivery stream. Kinesis Data Firehose can send data records to various destinations, including Amazon Simple Storage Service (Amazon S3), Amazon Redshift, Amazon OpenSearch Service,
and any HTTP endpoint that is owned by you or any of your third-party service providers. The following are the supported destinations:
* Amazon OpenSearch Service
* Amazon S3
* Datadog
* Dynatrace
* Honeycomb
* HTTP Endpoint
* Logic Monitor
* MongoDB Cloud
* New Relic
* Splunk
* Sumo Logic
https://docs.aws.amazon.com/firehose/latest/dev/create-name.html
https://aws.amazon.com/kinesis/data-streams/
Amazon Kinesis Data Streams (KDS) is a massively scalable and durable real-time data streaming service. KDS can continuously capture gigabytes of data per second from hundreds of thousands of sources such as website clickstreams, database event streams, financial transactions, social media feeds, IT logs, and location-tracking events.
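
To illustrate the Kinesis Data Streams pattern described above, here is a minimal Lambda handler sketch that strips sensitive attributes before writing to DynamoDB. The table name and the sensitive field names are illustrative assumptions.

```python
import base64
import json
from decimal import Decimal

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("transactions")            # assumed table name
SENSITIVE_FIELDS = {"card_number", "cvv", "ssn"}  # assumed sensitive fields


def handler(event, context):
    """Triggered by the Kinesis data stream; other consumers read the same stream."""
    for record in event["Records"]:
        raw = base64.b64decode(record["kinesis"]["data"])
        # Decimal keeps numeric values compatible with DynamoDB's type model.
        payload = json.loads(raw, parse_float=Decimal)
        # Remove sensitive attributes before persisting for low-latency retrieval.
        cleaned = {k: v for k, v in payload.items() if k not in SENSITIVE_FIELDS}
        table.put_item(Item=cleaned)
```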

NEW QUESTION 3
A company is migrating a distributed application to AWS. The application serves variable workloads. The legacy platform consists of a primary server that coordinates jobs across multiple compute nodes. The company wants to modernize the application with a solution that maximizes resiliency and scalability.
How should a solutions architect design the architecture to meet these requirements?

  • A. Configure an Amazon Simple Queue Service (Amazon SQS) queue as a destination for the jobs. Implement the compute nodes with Amazon EC2 instances that are managed in an Auto Scaling group.
  • B. Configure EC2 Auto Scaling to use scheduled scaling.
  • C. Configure an Amazon Simple Queue Service (Amazon SQS) queue as a destination for the jobs. Implement the compute nodes with Amazon EC2 instances that are managed in an Auto Scaling group. Configure EC2 Auto Scaling based on the size of the queue.
  • D. Implement the primary server and the compute nodes with Amazon EC2 instances that are managed in an Auto Scaling group.
  • E. Configure AWS CloudTrail as a destination for the jobs. Configure EC2 Auto Scaling based on the load on the primary server.
  • F. Implement the primary server and the compute nodes with Amazon EC2 instances that are managed in an Auto Scaling group. Configure Amazon EventBridge (Amazon CloudWatch Events) as a destination for the jobs. Configure EC2 Auto Scaling based on the load on the compute nodes.

Answer: B
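
A minimal boto3 sketch of scaling the worker group on SQS queue depth. The group and queue names are illustrative assumptions, and raw queue depth is a simplification of AWS's recommended backlog-per-instance custom metric.

```python
import boto3

autoscaling = boto3.client("autoscaling")

# Target-tracking policy keyed to the number of visible messages in the queue.
autoscaling.put_scaling_policy(
    AutoScalingGroupName="job-workers",            # assumed Auto Scaling group
    PolicyName="scale-on-queue-depth",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "CustomizedMetricSpecification": {
            "MetricName": "ApproximateNumberOfMessagesVisible",
            "Namespace": "AWS/SQS",
            "Dimensions": [{"Name": "QueueName", "Value": "jobs"}],  # assumed queue
            "Statistic": "Average",
        },
        "TargetValue": 100.0,  # illustrative target backlog
    },
)
```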

NEW QUESTION 4
A company's reporting system delivers hundreds of .csv files to an Amazon S3 bucket each day. The company must convert these files to Apache Parquet format and must store the files in a transformed data bucket.
Which solution will meet these requirements with the LEAST development effort?

  • A. Create an Amazon EMR cluster with Apache Spark installed. Write a Spark application to transform the data. Use EMR File System (EMRFS) to write files to the transformed data bucket.
  • B. Create an AWS Glue crawler to discover the data. Create an AWS Glue extract, transform, and load (ETL) job to transform the data. Specify the transformed data bucket in the output step.
  • C. Use AWS Batch to create a job definition with Bash syntax to transform the data and output the data to the transformed data bucket. Use the job definition to submit a job. Specify an array job as the job type.
  • D. Create an AWS Lambda function to transform the data and output the data to the transformed data bucket.
  • E. Configure an event notification for the S3 bucket.
  • F. Specify the Lambda function as the destination for the event notification.

Answer: D
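
A minimal sketch of the Lambda approach from the printed answer, assuming a Lambda layer provides pandas and pyarrow; the output bucket name is an illustrative assumption.

```python
import os
import urllib.parse

import boto3
import pandas as pd  # assumes a Lambda layer supplies pandas + pyarrow

s3 = boto3.client("s3")
OUTPUT_BUCKET = os.environ.get("OUTPUT_BUCKET", "transformed-data-bucket")  # assumed


def handler(event, context):
    """Invoked by the S3 event notification for each newly delivered .csv object."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        # Read the CSV straight from S3, convert, and write Parquet back out.
        df = pd.read_csv(s3.get_object(Bucket=bucket, Key=key)["Body"])
        df.to_parquet("/tmp/out.parquet", engine="pyarrow")
        out_key = key.rsplit(".", 1)[0] + ".parquet"
        s3.upload_file("/tmp/out.parquet", OUTPUT_BUCKET, out_key)
```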

NEW QUESTION 5
A company is creating a new application that will store a large amount of data. The data will be analyzed hourly and will be modified by several Amazon EC2 Linux instances that are deployed across multiple Availability Zones. The needed amount of storage space will continue to grow for the next 6 months.
Which storage solution should a solutions architect recommend to meet these requirements?

  • A. Store the data in Amazon S3 Glacier. Update the S3 Glacier vault policy to allow access to the application instances.
  • B. Store the data in an Amazon Elastic Block Store (Amazon EBS) volume. Mount the EBS volume on the application instances.
  • C. Store the data in an Amazon Elastic File System (Amazon EFS) file system. Mount the file system on the application instances.
  • D. Store the data in an Amazon Elastic Block Store (Amazon EBS) Provisioned IOPS volume shared between the application instances.

Answer: C
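
A minimal boto3 sketch of the EFS approach: one mount target per Availability Zone lets instances in every zone mount the same elastically growing file system. Subnet and security group IDs are illustrative assumptions.

```python
import boto3

efs = boto3.client("efs")

# Create the shared, elastically scaling file system.
fs = efs.create_file_system(PerformanceMode="generalPurpose", Encrypted=True)

# One mount target per Availability Zone so instances in each zone can mount it.
for subnet_id in ["subnet-aaa111", "subnet-bbb222", "subnet-ccc333"]:  # assumed subnets
    efs.create_mount_target(
        FileSystemId=fs["FileSystemId"],
        SubnetId=subnet_id,
        SecurityGroups=["sg-0123456789abcdef0"],  # assumed NFS security group
    )
```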

NEW QUESTION 6
A company has a business-critical application that runs on Amazon EC2 instances. The application stores data in an Amazon DynamoDB table. The company must be able to revert the table to any point within the last 24 hours. Which solution meets these requirements with the LEAST operational overhead?

  • A. Configure point-in-time recovery for the table.
  • B. Use AWS Backup for the table.
  • C. Use an AWS Lambda function to make an on-demand backup of the table every hour.
  • D. Turn on streams on the table to capture a log of all changes to the table in the last 24 hours.
  • E. Store a copy of the stream in an Amazon S3 bucket.

Answer: A
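
Enabling point-in-time recovery is a one-time call, a minimal sketch (the table name is an illustrative assumption):

```python
import boto3

dynamodb = boto3.client("dynamodb")

# One-time switch; DynamoDB then keeps up to 35 days of continuous backups,
# allowing restores to any second in that window (covers the 24-hour need).
dynamodb.update_continuous_backups(
    TableName="business-critical-table",  # assumed table name
    PointInTimeRecoverySpecification={"PointInTimeRecoveryEnabled": True},
)
```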

NEW QUESTION 7
A company that hosts its web application on AWS wants to ensure that all Amazon EC2 instances, Amazon RDS DB instances, and Amazon Redshift clusters are configured with tags. The company wants to minimize the effort of configuring and operating this check.
What should a solutions architect do to accomplish this?

  • A. Use AWS Config rules to define and detect resources that are not properly tagged.
  • B. Use Cost Explorer to display resources that are not properly tagged.
  • C. Tag those resources manually.
  • D. Write API calls to check all resources for proper tag allocation.
  • E. Periodically run the code on an EC2 instance.
  • F. Write API calls to check all resources for proper tag allocation.
  • G. Schedule an AWS Lambda function through Amazon CloudWatch to periodically run the code.

Answer: A
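
A minimal sketch of the AWS Config approach using the managed REQUIRED_TAGS rule; the rule name and tag key are illustrative assumptions.

```python
import json

import boto3

config = boto3.client("config")

# Managed rule REQUIRED_TAGS flags untagged EC2, RDS, and Redshift resources.
config.put_config_rule(
    ConfigRule={
        "ConfigRuleName": "required-tags-check",  # assumed rule name
        "Source": {"Owner": "AWS", "SourceIdentifier": "REQUIRED_TAGS"},
        "InputParameters": json.dumps({"tag1Key": "CostCenter"}),  # assumed tag key
        "Scope": {
            "ComplianceResourceTypes": [
                "AWS::EC2::Instance",
                "AWS::RDS::DBInstance",
                "AWS::Redshift::Cluster",
            ]
        },
    }
)
```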

NEW QUESTION 8
A company wants to run applications in containers in the AWS Cloud. The applications are stateless and can tolerate disruptions.
What should a solutions architect do to meet these requirements?

  • A. Use Spot Instances in an Amazon EC2 Auto Scaling group to run the application containers
  • B. Use Spot Instances in an Amazon Elastic Kubernetes Service (Amazon EKS) managed node group
  • C. Use On-Demand Instances in an Amazon EC2 Auto Scaling group to run the application containers
  • D. Use On-Demand Instances in an Amazon Elastic Kubernetes Service (Amazon EKS) managed node group.

Answer: A
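
A minimal sketch of running the group entirely on Spot capacity via a mixed instances policy; the group name, subnets, launch template, and sizing are illustrative assumptions.

```python
import boto3

autoscaling = boto3.client("autoscaling")

# Stateless, interruption-tolerant containers can run 100% on Spot Instances.
autoscaling.create_auto_scaling_group(
    AutoScalingGroupName="container-workers",          # assumed group name
    MinSize=2,
    MaxSize=20,
    VPCZoneIdentifier="subnet-aaa111,subnet-bbb222",   # assumed subnets
    MixedInstancesPolicy={
        "LaunchTemplate": {
            "LaunchTemplateSpecification": {
                "LaunchTemplateName": "container-node",  # assumed launch template
                "Version": "$Latest",
            }
        },
        "InstancesDistribution": {
            "OnDemandPercentageAboveBaseCapacity": 0,    # all Spot above base
            "SpotAllocationStrategy": "capacity-optimized",
        },
    },
)
```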

NEW QUESTION 9
A company is building an ecommerce application and needs to store sensitive customer information. The company needs to give customers the ability to complete purchase transactions on the website. The company also needs to ensure that sensitive customer data is protected, even from database administrators.
Which solution meets these requirements?

  • A. Store sensitive data in an Amazon Elastic Block Store (Amazon EBS) volume.
  • B. Use EBS encryption to encrypt the data.
  • C. Use an IAM instance role to restrict access.
  • D. Store sensitive data in Amazon RDS for MySQL.
  • E. Use AWS Key Management Service (AWS KMS) client-side encryption to encrypt the data.
  • F. Store sensitive data in Amazon S3. Use AWS Key Management Service (AWS KMS) server-side encryption to encrypt the data.
  • G. Use S3 bucket policies to restrict access.
  • H. Store sensitive data in Amazon FSx for Windows Server.
  • I. Mount the file share on application servers. Use Windows file permissions to restrict access.

Answer: C

NEW QUESTION 10
A company hosts its web applications in the AWS Cloud. The company configures Elastic Load Balancers to use certificates that are imported into AWS Certificate Manager (ACM). The company's security team must be notified 30 days before the expiration of each certificate.
What should a solutions architect recommend to meet the requirement?

  • A. Add a rule in ACM to publish a custom message to an Amazon Simple Notification Service (Amazon SNS) topic every day, beginning 30 days before any certificate will expire.
  • B. Create an AWS Config rule that checks for certificates that will expire within 30 days.
  • C. Configure Amazon EventBridge (Amazon CloudWatch Events) to invoke a custom alert by way of Amazon Simple Notification Service (Amazon SNS) when AWS Config reports a noncompliant resource.
  • D. Use AWS Trusted Advisor to check for certificates that will expire within 30 days.
  • E. Create an Amazon CloudWatch alarm that is based on Trusted Advisor metrics for check status changes. Configure the alarm to send a custom alert by way of Amazon Simple Notification Service (Amazon SNS).
  • F. Create an Amazon EventBridge (Amazon CloudWatch Events) rule to detect any certificates that will expire within 30 days.
  • G. Configure the rule to invoke an AWS Lambda function.
  • H. Configure the Lambda function to send a custom alert by way of Amazon Simple Notification Service (Amazon SNS).

Answer: B
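
A minimal sketch of the AWS Config side of this answer, using the managed acm-certificate-expiration-check rule; the rule name is an illustrative assumption. An EventBridge rule on the resulting compliance-change event would then publish the SNS notification.

```python
import json

import boto3

config = boto3.client("config")

# Marks ACM certificates expiring within 30 days as noncompliant.
config.put_config_rule(
    ConfigRule={
        "ConfigRuleName": "acm-cert-expiry-30-days",  # assumed rule name
        "Source": {
            "Owner": "AWS",
            "SourceIdentifier": "ACM_CERTIFICATE_EXPIRATION_CHECK",
        },
        "InputParameters": json.dumps({"daysToExpiration": "30"}),
    }
)
```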

NEW QUESTION 11
A company has an application that runs on Amazon EC2 instances and uses an Amazon Aurora database. The EC2 instances connect to the database by using user names and passwords that are stored locally in a file. The company wants to minimize the operational overhead of credential management.
What should a solutions architect do to accomplish this goal?

  • A. Use AWS Secrets Manager.
  • B. Turn on automatic rotation.
  • C. Use AWS Systems Manager Parameter Store.
  • D. Turn on automatic rotation.
  • E. Create an Amazon S3 bucket to store objects that are encrypted with an AWS Key Management Service (AWS KMS) encryption key.
  • F. Migrate the credential file to the S3 bucket.
  • G. Point the application to the S3 bucket.
  • H. Create an encrypted Amazon Elastic Block Store (Amazon EBS) volume for each EC2 instance.
  • I. Attach the new EBS volume to each EC2 instance.
  • J. Migrate the credential file to the new EBS volume.
  • K. Point the application to the new EBS volume.

Answer: A
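
A minimal sketch of the Secrets Manager approach: turn on automatic rotation and have the application fetch credentials at runtime instead of reading a local file. The secret name and rotation Lambda ARN are illustrative assumptions; AWS publishes ready-made rotation functions for RDS and Aurora.

```python
import json

import boto3

secrets = boto3.client("secretsmanager")

# Enable automatic rotation using a rotation Lambda (ARN is an assumption).
secrets.rotate_secret(
    SecretId="prod/aurora/app-user",  # assumed secret name
    RotationLambdaARN="arn:aws:lambda:us-east-1:123456789012:function:rotate-aurora",
    RotationRules={"AutomaticallyAfterDays": 30},
)

# Application code retrieves the current credentials on demand.
creds = json.loads(
    secrets.get_secret_value(SecretId="prod/aurora/app-user")["SecretString"]
)
```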

NEW QUESTION 12
A business's backup data totals 700 terabytes (TB) and is kept in network attached storage (NAS) at its data center. This backup data must be available in the event of occasional regulatory inquiries and preserved for a period of seven years. The organization has chosen to relocate its backup data from its on-premises data center to Amazon Web Services (AWS). Within one month, the migration must be completed. The company's public internet connection provides 500 Mbps of dedicated capacity for data transport.
What should a solutions architect do to ensure that data is migrated and stored at the LOWEST possible cost?

  • A. Order AWS Snowball devices to transfer the data.
  • B. Use a lifecycle policy to transition the files to Amazon S3 Glacier Deep Archive.
  • C. Deploy a VPN connection between the data center and Amazon VPC.
  • D. Use the AWS CLI to copy the data from on premises to Amazon S3 Glacier.
  • E. Provision a 500 Mbps AWS Direct Connect connection and transfer the data to Amazon S3. Use a lifecycle policy to transition the files to Amazon S3 Glacier Deep Archive.
  • F. Use AWS DataSync to transfer the data and deploy a DataSync agent on premises.
  • G. Use the DataSync task to copy files from the on-premises NAS storage to Amazon S3 Glacier.

Answer: A
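
A quick back-of-the-envelope check of why the 500 Mbps link rules out any online transfer within the one-month window, and why Snowball plus a lifecycle transition to S3 Glacier Deep Archive is the lowest-cost fit for the seven-year retention:

```python
# 700 TB over a dedicated 500 Mbps link, assuming ideal 100% utilization.
data_bits = 700 * 10**12 * 8   # 700 TB expressed in bits
link_bps = 500 * 10**6         # 500 Mbps
seconds = data_bits / link_bps
print(seconds / 86400)         # ~129.6 days -- far beyond the 1-month deadline
```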

NEW QUESTION 13
A public-facing web application queries a database hosted on an Amazon EC2 instance in a private subnet. A large number of queries involve multiple table joins, and the application performance has been degrading due to an increase in complex queries. The application team will be performing updates to improve performance.
What should a solutions architect recommend to the application team? (Select TWO.)

  • A. Cache query data in Amazon SQS
  • B. Create a read replica to offload queries
  • C. Migrate the database to Amazon Athena
  • D. Implement Amazon DynamoDB Accelerator to cache data.
  • E. Migrate the database to Amazon RDS

Answer: BE
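
After the migration to Amazon RDS, offloading the complex read-only queries is a single call, a minimal sketch (instance identifiers are illustrative assumptions):

```python
import boto3

rds = boto3.client("rds")

# Create a read replica to absorb the join-heavy reporting queries.
rds.create_db_instance_read_replica(
    DBInstanceIdentifier="app-db-replica-1",      # assumed replica name
    SourceDBInstanceIdentifier="app-db-primary",  # assumed source RDS instance
)
```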

NEW QUESTION 14
A company is planning on deploying a newly built application on AWS in a default VPC. The application will consist of a web layer and database layer. The web server was created in public subnets, and the MySQL database was created in a private subnet. All subnets are created with the default network ACL settings, and the default security group in the VPC will be replaced with new custom security groups.
Which combination of steps should a solutions architect take to meet these requirements? (Select TWO.)

  • A. Create a database server security group with inbound and outbound rules for MySQL port 3306 traffic to and from anywhere (0.0.0.0/0).
  • B. Create a database server security group with an inbound rule for MySQL port 3306 and specify the source as a web server security group.
  • C. Create a web server security group with an inbound allow rule for HTTPS port 443 traffic from anywhere (0.0.0.0/0) and an inbound deny rule for IP range 182.20.0.0/16.
  • D. Create a web server security group with an inbound rule for HTTPS port 443 traffic from anywhere (0.0.0.0/0). Create network ACL inbound and outbound deny rules for IP range 182.20.0.0/16.
  • E. Create a web server security group with inbound and outbound rules for HTTPS port 443 traffic to and from anywhere (0.0.0.0/0). Create a network ACL inbound deny rule for IP range 182.20.0.0/16.

Answer: BD
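
A minimal sketch of the security group chaining in option B: the database group admits MySQL traffic only from members of the web server group. Group IDs are illustrative assumptions.

```python
import boto3

ec2 = boto3.client("ec2")

# Web tier: allow HTTPS from anywhere.
ec2.authorize_security_group_ingress(
    GroupId="sg-web111",  # assumed web server security group ID
    IpPermissions=[{
        "IpProtocol": "tcp", "FromPort": 443, "ToPort": 443,
        "IpRanges": [{"CidrIp": "0.0.0.0/0"}],
    }],
)

# Database tier: allow MySQL only from the web server security group.
ec2.authorize_security_group_ingress(
    GroupId="sg-db222",   # assumed database security group ID
    IpPermissions=[{
        "IpProtocol": "tcp", "FromPort": 3306, "ToPort": 3306,
        "UserIdGroupPairs": [{"GroupId": "sg-web111"}],
    }],
)
```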

NEW QUESTION 15
A company uses NFS to store large video files in on-premises network attached storage. Each video file ranges in size from 1MB to 500 GB. The total storage is 70 TB and is no longer growing. The company decides to migrate the video files to Amazon S3. The company must migrate the video files as soon as possible while using the least possible network bandwidth.
Which solution will meet these requirements?

  • A. Create an S3 bucket. Create an IAM role that has permissions to write to the S3 bucket.
  • B. Use the AWS CLI to copy all files locally to the S3 bucket.
  • C. Create an AWS Snowball Edge job.
  • D. Receive a Snowball Edge device on premises.
  • E. Use the Snowball Edge client to transfer data to the device.
  • F. Return the device so that AWS can import the data into Amazon S3.
  • G. Deploy an S3 File Gateway on premises.
  • H. Create a public service endpoint to connect to the S3 File Gateway. Create an S3 bucket. Create a new NFS file share on the S3 File Gateway. Point the new file share to the S3 bucket.
  • I. Transfer the data from the existing NFS file share to the S3 File Gateway.
  • J. Set up an AWS Direct Connect connection between the on-premises network and AWS.
  • K. Deploy an S3 File Gateway on premises.
  • L. Create a public virtual interface (VIF) to connect to the S3 File Gateway.
  • M. Create an S3 bucket.
  • N. Create a new NFS file share on the S3 File Gateway.
  • O. Point the new file share to the S3 bucket.
  • P. Transfer the data from the existing NFS file share to the S3 File Gateway.

Answer: C

NEW QUESTION 16
A company wants to build a scalable key management infrastructure to support developers who need to encrypt data in their applications.
What should a solutions architect do to reduce the operational burden?

  • A. Use multifactor authentication (MFA) to protect the encryption keys.
  • B. Use AWS Key Management Service (AWS KMS) to protect the encryption keys
  • C. Use AWS Certificate Manager (ACM) to create, store, and assign the encryption keys
  • D. Use an IAM policy to limit the scope of users who have access permissions to protect the encryption keys

Answer: B
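
A minimal sketch of how developers typically use KMS for application-level encryption via envelope encryption; the key alias is an illustrative assumption.

```python
import boto3

kms = boto3.client("kms")

# Envelope encryption: KMS returns a plaintext data key for local encryption
# and an encrypted copy to store alongside the data.
data_key = kms.generate_data_key(KeyId="alias/app-data", KeySpec="AES_256")
plaintext_key = data_key["Plaintext"]       # use with a local cipher, then discard
encrypted_key = data_key["CiphertextBlob"]  # persist next to the ciphertext

# Later, decrypt the stored data key to recover the plaintext key.
restored = kms.decrypt(CiphertextBlob=encrypted_key)["Plaintext"]
```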

NEW QUESTION 17
A company wants to run a gaming application on Amazon EC2 instances that are part of an Auto Scaling group in the AWS Cloud. The application will transmit data by using UDP packets. The company wants to ensure that the application can scale out and in as traffic increases and decreases.
What should a solutions architect do to meet these requirements?

  • A. Attach a Network Load Balancer to the Auto Scaling group
  • B. Attach an Application Load Balancer to the Auto Scaling group.
  • C. Deploy an Amazon Route 53 record set with a weighted policy to route traffic appropriately
  • D. Deploy a NAT instance that is configured with port forwarding to the EC2 instances in the Auto Scaling group.

Answer: A
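
Network Load Balancers operate at layer 4 and are the Elastic Load Balancing option with UDP listener support, which Application Load Balancers (HTTP/HTTPS) lack. A minimal sketch of adding a UDP listener; the ARNs and port are illustrative assumptions.

```python
import boto3

elbv2 = boto3.client("elbv2")

# UDP listener on the Network Load Balancer in front of the Auto Scaling group.
elbv2.create_listener(
    LoadBalancerArn="arn:aws:elasticloadbalancing:us-east-1:123456789012:loadbalancer/net/game-nlb/abc123",
    Protocol="UDP",
    Port=7777,  # assumed game traffic port
    DefaultActions=[{
        "Type": "forward",
        "TargetGroupArn": "arn:aws:elasticloadbalancing:us-east-1:123456789012:targetgroup/game-servers/def456",
    }],
)
```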

NEW QUESTION 18
An image-processing company has a web application that users use to upload images. The application uploads the images into an Amazon S3 bucket. The company has set up S3 event notifications to publish the object creation events to an Amazon Simple Queue Service (Amazon SQS) standard queue. The SQS queue serves as the event source for an AWS Lambda function that processes the images and sends the results to users through email.
Users report that they are receiving multiple email messages for every uploaded image. A solutions architect determines that SQS messages are invoking the Lambda function more than once, resulting in multiple email messages.
What should the solutions architect do to resolve this issue with the LEAST operational overhead?

  • A. Set up long polling in the SQS queue by increasing the ReceiveMessage wait time to 30 seconds.
  • B. Change the SQS standard queue to an SQS FIFO queue.
  • C. Use the message deduplication ID to discard duplicate messages.
  • D. Increase the visibility timeout in the SQS queue to a value that is greater than the total of the function timeout and the batch window timeout.
  • E. Modify the Lambda function to delete each message from the SQS queue immediately after the message is read before processing.

Answer: D
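
Duplicate invocations typically mean SQS redelivered a message before Lambda finished it. A minimal sketch of raising the visibility timeout past the function timeout plus the batch window; the queue URL and values are illustrative assumptions.

```python
import boto3

sqs = boto3.client("sqs")

# Give Lambda enough time to finish before SQS makes the message visible again:
# visibility timeout > function timeout + batch window.
sqs.set_queue_attributes(
    QueueUrl="https://sqs.us-east-1.amazonaws.com/123456789012/image-events",  # assumed
    Attributes={"VisibilityTimeout": "360"},  # e.g., 6x a 60-second function timeout
)
```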

NEW QUESTION 19
A company is designing a new web application that the company will deploy into a single AWS Region. The application requires a two-tier architecture that will include Amazon EC2 instances and an Amazon RDS DB instance. A solutions architect needs to design the application so that all components are highly available.
Which solution will meet these requirements?

  • A. Deploy EC2 instances in an additional Region. Create a DB instance with the Multi-AZ option activated.
  • B. Deploy all EC2 instances in the same Region and the same Availability Zone.
  • C. Create a DB instance with the Multi-AZ option activated.
  • D. Deploy the EC2 instances across at least two Availability Zones within the same Region.
  • E. Create a DB instance in a single Availability Zone.
  • F. Deploy the EC2 instances across at least two Availability Zones within the same Region.
  • G. Create a DB instance with the Multi-AZ option activated.

Answer: D
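
A minimal sketch of the Multi-AZ database side of this design; Multi-AZ keeps a synchronous standby in a second Availability Zone with automatic failover. Identifiers and sizing are illustrative assumptions.

```python
import boto3

rds = boto3.client("rds")

# Multi-AZ DB instance for the data tier of the two-tier architecture.
rds.create_db_instance(
    DBInstanceIdentifier="webapp-db",   # assumed identifier
    Engine="mysql",
    DBInstanceClass="db.m5.large",      # illustrative sizing
    AllocatedStorage=100,
    MasterUsername="admin",
    ManageMasterUserPassword=True,      # RDS stores the password in Secrets Manager
    MultiAZ=True,                       # synchronous standby in a second AZ
)
```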

NEW QUESTION 20
An online retail company has more than 50 million active customers and receives more than 25,000 orders each day. The company collects purchase data for customers and stores this data in Amazon S3. Additional customer data is stored in Amazon RDS.
The company wants to make all the data available to various teams so that the teams can perform analytics. The solution must provide the ability to manage fine-grained permissions for the data and must minimize operational overhead.
Which solution will meet these requirements?

  • A. Migrate the purchase data to write directly to Amazon RDS.
  • B. Use RDS access controls to limit access.
  • C. Schedule an AWS Lambda function to periodically copy data from Amazon RDS to Amazon S3. Create an AWS Glue crawler.
  • D. Use Amazon Athena to query the data.
  • E. Use S3 policies to limit access.
  • F. Create a data lake by using AWS Lake Formation.
  • G. Create an AWS Glue JDBC connection to Amazon RDS.
  • H. Register the S3 bucket in Lake Formation.
  • I. Use Lake Formation access controls to limit access.
  • J. Create an Amazon Redshift cluster. Schedule an AWS Lambda function to periodically copy data from Amazon S3 and Amazon RDS to Amazon Redshift.
  • K. Use Amazon Redshift access controls to limit access.

Answer: C
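
To illustrate the fine-grained permissions of the Lake Formation option mentioned in the choices, a minimal sketch of a column-level grant on a Glue Data Catalog table; the role ARN, database, table, and column names are illustrative assumptions.

```python
import boto3

lakeformation = boto3.client("lakeformation")

# Column-level SELECT grant on a table registered with Lake Formation.
lakeformation.grant_permissions(
    Principal={"DataLakePrincipalIdentifier": "arn:aws:iam::123456789012:role/analytics-team"},
    Resource={
        "TableWithColumns": {
            "DatabaseName": "sales",                          # assumed database
            "Name": "purchases",                              # assumed table
            "ColumnNames": ["order_id", "order_date", "total"],
        }
    },
    Permissions=["SELECT"],
)
```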

NEW QUESTION 21
......

Recommend!! Get the Full SAA-C03 dumps in VCE and PDF From Dumps-files.com, Welcome to Download: https://www.dumps-files.com/files/SAA-C03/ (New 0 Q&As Version)
