AWS-Certified-Security-Specialty Premium Bundle

Amazon AWS Certified Security - Specialty Certification Exam

4.5 (54105 ratings)
Practice Tests: 485 Questions
Print version: 485 PDF
Last update: November 23, 2024

Amazon AWS-Certified-Security-Specialty Free Practice Questions

Act now and download your Amazon AWS-Certified-Security-Specialty test today! Do not waste time on worthless Amazon AWS-Certified-Security-Specialty tutorials. Download the refreshed Amazon AWS Certified Security - Specialty exam with real questions and answers and begin to learn Amazon AWS-Certified-Security-Specialty like a true professional.

Free AWS-Certified-Security-Specialty Demo Online For Amazon Certification:

NEW QUESTION 1
A company requires that data stored in AWS be encrypted at rest. Which of the following approaches achieve this requirement? Select 2 answers from the options given below.
Please select:

  • A. When storing data in Amazon EBS, use only EBS-optimized Amazon EC2 instances.
  • B. When storing data in EBS, encrypt the volume by using AWS KMS.
  • C. When storing data in Amazon S3, use object versioning and MFA Delete.
  • D. When storing data in Amazon EC2 Instance Store, encrypt the volume by using KMS.
  • E. When storing data in S3, enable server-side encryption.

Answer: BE

Explanation:
The AWS Documentation mentions the following
To create an encrypted Amazon EBS volume, select the appropriate box in the Amazon EBS section of the Amazon EC2 console. You can use a custom customer master key (CMK) by choosing one from the list that appears below the encryption box. If you do not specify a custom CMK, Amazon EBS uses the AWS-managed CMK for Amazon EBS in your account. If there is no AWS-managed CMK for Amazon EBS in your account, Amazon EBS creates one.
Data protection refers to protecting data while in-transit (as it travels to and from Amazon S3) and at rest (while it is stored on disks in Amazon S3 data centers). You can protect data in transit by using
SSL or by using client-side encryption. You have the following options of protecting data at rest in Amazon S3.
• Use Server-Side Encryption - You request Amazon S3 to encrypt your object before saving it on disks in its data centers and decrypt it when you download the objects.
• Use Client-Side Encryption - You can encrypt data client-side and upload the encrypted data to Amazon S3. In this case, you manage the encryption process, the encryption keys, and related tools.
Option A is invalid because using EBS-optimized Amazon EC2 instances alone will not encrypt data at rest; EBS optimization only affects throughput between the instance and the volume.
Option C is invalid because object versioning and MFA Delete do not encrypt data at rest for S3 objects.
Option D is invalid because instance store volumes cannot be encrypted by using KMS; instance store is ephemeral storage.
For more information on EBS encryption, please visit the below URL: https://docs.aws.amazon.com/kms/latest/developerguide/services-ebs.html
For more information on S3 encryption, please visit the below URL: https://docs.aws.amazon.com/AmazonS3/latest/dev/UsingEncryption.html
The correct answers are: When storing data in EBS, encrypt the volume by using AWS KMS. When storing data in S3, enable server-side encryption.
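As a quick sketch of what both correct options look like in practice, the request parameters below correspond to an encrypted EBS volume and an SSE-KMS S3 upload. The availability zone, bucket, object key, and key alias are hypothetical; a real call would pass these to boto3's `create_volume` and `put_object`.

```python
# Hedged sketch (no live AWS call): parameters that enable encryption at
# rest for EBS (option B) and S3 server-side encryption (option E).
ebs_params = {
    "AvailabilityZone": "us-east-1a",   # hypothetical AZ
    "Size": 100,                        # GiB
    "Encrypted": True,                  # encrypt the volume at rest
    "KmsKeyId": "alias/my-app-key",     # hypothetical CMK; omit to use the AWS-managed key
}
s3_params = {
    "Bucket": "example-bucket",         # hypothetical bucket
    "Key": "report.csv",
    "Body": b"sensitive data",
    "ServerSideEncryption": "aws:kms",  # or "AES256" for SSE-S3
}
```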
Submit your Feedback/Queries to our Experts

NEW QUESTION 2
A company has resources hosted in their AWS account. There is a requirement to monitor all API activity across all regions, and the audit must automatically apply to future regions as well. Which of the following can be used to fulfil this requirement?
Please select:

  • A. Enable CloudTrail for each region, then enable it for each future region.
  • B. Ensure one CloudTrail trail is enabled for all regions.
  • C. Create a CloudTrail trail for each region and use CloudFormation to enable the trail for all future regions.
  • D. Create a CloudTrail trail for each region and use AWS Config to enable the trail for all future regions.

Answer: B

Explanation:
The AWS Documentation mentions the following
You can now turn on a trail across all regions for your AWS account. CloudTrail will deliver log files from all regions to the Amazon S3 bucket and an optional CloudWatch Logs log group you specified. Additionally, when AWS launches a new region, CloudTrail will create the same trail in the new region. As a result you will receive log files containing API activity for the new region without taking any action.
Options A and C are invalid because enabling CloudTrail separately for every region, now and for each future region, would be a maintenance overhead. Option D is invalid because AWS Config cannot be used to enable trails.
For more information on this feature, please visit the following URL:
https://aws.amazon.com/about-aws/whats-new/2015/12/turn-on-cloudtrail-across-all-regions-and-support-for-multiple-trails/
The correct answer is: Ensure one CloudTrail trail is enabled for all regions.
Submit your Feedback/Queries to our Experts
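As a sketch of how such a trail could be created, the parameters below show the one setting that matters for this question. The trail name and bucket are hypothetical; the real call would be `boto3.client("cloudtrail").create_trail(**trail_params)`.

```python
# Hedged sketch: one trail that records API activity in all current regions
# and is automatically extended by CloudTrail to future regions.
trail_params = {
    "Name": "org-wide-trail",              # hypothetical trail name
    "S3BucketName": "my-cloudtrail-logs",  # hypothetical destination bucket
    "IsMultiRegionTrail": True,            # the key setting for this question
}
```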

NEW QUESTION 3
A company hosts data in S3. There is a requirement to control access to the S3 buckets. Which are the 2 ways in which this can be achieved?
Please select:

  • A. Use Bucket policies
  • B. Use the Secure Token service
  • C. Use IAM user policies
  • D. Use AWS Access Keys

Answer: AC

Explanation:
The AWS Documentation mentions the following
Amazon S3 offers access policy options broadly categorized as resource-based policies and user policies. Access policies you attach to your resources (buckets and objects) are referred to as resource-based policies. For example, bucket policies and access control lists (ACLs) are resource-based policies. You can also attach access policies to users in your account. These are called user policies. You may choose to use resource-based policies, user policies, or some combination of these to manage permissions to your Amazon S3 resources.
Options B and D are invalid because these cannot be used to control access to S3 buckets. For more information on S3 access control, please refer to the below link: https://docs.aws.amazon.com/AmazonS3/latest/dev/s3-access-control.html
The correct answers are: Use Bucket policies. Use IAM user policies.
Submit your Feedback/Queries to our Experts
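A minimal sketch of the resource-based option: a bucket policy granting one IAM user read access. The bucket name, account ID, and user name are hypothetical.

```python
import json

# Hedged sketch: a resource-based bucket policy. A user policy (option C)
# would carry the same Action/Resource pairs but be attached to the IAM
# user, with no Principal element.
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "AllowUserRead",
        "Effect": "Allow",
        "Principal": {"AWS": "arn:aws:iam::123456789012:user/analyst"},
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": ["arn:aws:s3:::example-bucket",
                     "arn:aws:s3:::example-bucket/*"],
    }],
}
policy_document = json.dumps(bucket_policy)  # the JSON you would attach to the bucket
```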

NEW QUESTION 4
A new application will be deployed on EC2 instances in private subnets. The application will transfer sensitive data to and from an S3 bucket. Compliance requirements state that the data must not traverse the public internet. Which solution meets the compliance requirement?
Please select:

  • A. Access the S3 bucket through a proxy server
  • B. Access the S3 bucket through a NAT gateway.
  • C. Access the S3 bucket through a VPC endpoint for S3
  • D. Access the S3 bucket through the SSL protected S3 endpoint

Answer: C

Explanation:
The AWS Documentation mentions the following
A VPC endpoint enables you to privately connect your VPC to supported AWS services and VPC endpoint services powered by PrivateLink without requiring an internet gateway, NAT device, VPN connection, or AWS Direct Connect connection. Instances in your VPC do not require public IP addresses to communicate with resources in the service. Traffic between your VPC and the other service does not leave the Amazon network.
Option A is invalid because a proxy server does not by itself keep the traffic off the public internet.
Options B and D are invalid because traffic through a NAT gateway or the public S3 endpoint still traverses the internet.
For more information on VPC endpoints, please see the below link: https://docs.aws.amazon.com/AmazonVPC/latest/UserGuide/vpc-endpoints.html
The correct answer is: Access the S3 bucket through a VPC endpoint for S3.
Submit your Feedback/Queries to our Experts
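As a sketch, the parameters below create a gateway endpoint for S3 and attach it to the private subnets' route table. The VPC ID, route table ID, and region are hypothetical; the real call is `boto3.client("ec2").create_vpc_endpoint(**endpoint_params)`.

```python
# Hedged sketch: a gateway VPC endpoint for S3, so instances in the private
# subnets reach S3 without traffic leaving the Amazon network.
endpoint_params = {
    "VpcEndpointType": "Gateway",
    "VpcId": "vpc-0abc1234567890",
    "ServiceName": "com.amazonaws.us-east-1.s3",
    "RouteTableIds": ["rtb-0def1234567890"],  # route table used by the private subnets
}
```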

NEW QUESTION 5
You have a requirement to serve up private content using the keys available with Cloudfront. How can this be achieved?
Please select:

  • A. Add the keys to the backend distribution.
  • B. Add the keys to the S3 bucket
  • C. Create pre-signed URLs
  • D. Use AWS Access keys

Answer: C

Explanation:
Options A and B are invalid because you do not add keys to either the backend distribution or the S3 bucket.
Option D is invalid because AWS access keys are used for programmatic access to AWS resources.
You can use CloudFront key pairs to create a trusted pre-signed URL which can be distributed to users.
Specifying the AWS Accounts That Can Create Signed URLs and Signed Cookies (Trusted Signers) Topics:
• Creating CloudFront Key Pairs for Your Trusted Signers
• Reformatting the CloudFront Private Key (.NET and Java Only)
• Adding Trusted Signers to Your Distribution
• Verifying that Trusted Signers Are Active (Optional)
• Rotating CloudFront Key Pairs
To create signed URLs or signed cookies, you need at least one AWS account that has an active CloudFront key pair. This account is known as a trusted signer. The trusted signer has two purposes:
• As soon as you add the AWS account ID for your trusted signer to your distribution, CloudFront starts to require that users use signed URLs or signed cookies to access your objects.
• When you create signed URLs or signed cookies, you use the private key from the trusted signer's key pair to sign a portion of the URL or the cookie. When someone requests a restricted object, CloudFront compares the signed portion of the URL or cookie with the unsigned portion to verify that the URL or cookie hasn't been tampered with. CloudFront also verifies that the URL or cookie is valid, meaning, for example, that the expiration date and time hasn't passed.
For more information on CloudFront private trusted content, please visit the following URL:
• https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/private-content-trusted-signers.html
The correct answer is: Create pre-signed URLs.
Submit your Feedback/Queries to our Experts
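A sketch of the piece that actually gets signed: the policy document for a signed URL with an expiry condition. The actual RSA signing with the trusted signer's private key is omitted here, and the distribution domain and object path are hypothetical.

```python
import json
import time

# Hedged sketch: the policy CloudFront evaluates for a signed URL. The
# compact JSON below is what would be signed with the key pair's private key.
expires = int(time.time()) + 3600  # URL valid for one hour
policy = {
    "Statement": [{
        "Resource": "https://d111111abcdef8.cloudfront.net/private/report.pdf",
        "Condition": {"DateLessThan": {"AWS:EpochTime": expires}},
    }]
}
policy_json = json.dumps(policy, separators=(",", ":"))  # compact form to be signed
```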

NEW QUESTION 6
A user has enabled versioning on an S3 bucket. The user is using server-side encryption for data at rest. If the user is supplying his own keys for encryption (SSE-C), which of the below mentioned statements is true?
Please select:

  • A. The user should use the same encryption key for all versions of the same object
  • B. It is possible to have different encryption keys for different versions of the same object
  • C. AWS S3 does not allow the user to upload his own keys for server side encryption
  • D. The SSE-C does not work when versioning is enabled

Answer: B

Explanation:
With SSE-C, you are managing your own encryption keys: you supply the key with each request, and Amazon S3 uses it to encrypt the object before saving it. Because the key is provided per request, each version of an object can be encrypted with a different key.
Option A is invalid because you are not required to use the same key; ideally you should use different encryption keys.
Option C is invalid because you can upload your own encryption keys for server-side encryption (SSE-C).
Option D is invalid because SSE-C works even if versioning is enabled.
For more information on server-side encryption with customer-provided keys, please visit the below link: https://docs.aws.amazon.com/AmazonS3/latest/dev/ServerSideEncryptionCustomerKeys.html
The correct answer is: It is possible to have different encryption keys for different versions of the same object.
Submit your Feedback/Queries to our Experts
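A sketch of the SSE-C request headers in their REST API form, showing how each version upload can carry a different key. The bucket and object are implied and hypothetical; only the header construction is shown.

```python
import base64
import hashlib
import os

# Hedged sketch: the SSE-C headers S3 expects on each request. Because the
# caller supplies the key every time, two versions of the same object can be
# written with different keys.
def sse_c_headers(key_bytes):
    return {
        "x-amz-server-side-encryption-customer-algorithm": "AES256",
        "x-amz-server-side-encryption-customer-key":
            base64.b64encode(key_bytes).decode(),
        "x-amz-server-side-encryption-customer-key-MD5":
            base64.b64encode(hashlib.md5(key_bytes).digest()).decode(),
    }

v1_headers = sse_c_headers(os.urandom(32))  # version 1, key 1
v2_headers = sse_c_headers(os.urandom(32))  # version 2, a different key
```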

NEW QUESTION 7
A company has hired a third-party security auditor, and the auditor needs read-only access to all AWS resources and logs of all VPC records and events that have occurred on AWS. How can the company meet the auditor's requirements without compromising security in the AWS environment? Choose the correct answer from the options below.
Please select:

  • A. Create a role that has the required permissions for the auditor.
  • B. Create an SNS notification that sends the CloudTrail log files to the auditor's email when CloudTrail delivers the logs to S3, but do not allow the auditor access to the AWS environment.
  • C. The company should contact AWS as part of the shared responsibility model, and AWS will grant required access to the third-party auditor.
  • D. Enable CloudTrail logging and create an IAM user who has read-only permissions to the required AWS resources, including the bucket containing the CloudTrail logs.

Answer: D

Explanation:
AWS CloudTrail is a service that enables governance, compliance, operational auditing, and risk auditing of your AWS account. With CloudTrail, you can log, continuously monitor, and retain events related to API calls across your AWS infrastructure. CloudTrail provides a history of AWS API calls for your account including API calls made through the AWS Management Console, AWS SDKs, command line tools, and other AWS services. This history simplifies security analysis, resource change tracking, and troubleshooting.
Options A and C are incorrect since CloudTrail needs to be used as part of the solution. Option B is incorrect since the auditor needs to have access to CloudTrail itself, not just emailed log files.
For more information on CloudTrail, please visit the below URL: https://aws.amazon.com/cloudtrail/
The correct answer is: Enable CloudTrail logging and create an IAM user who has read-only permissions to the required AWS resources, including the bucket containing the CloudTrail logs.
Submit your Feedback/Queries to our Experts
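A sketch of what the auditor's read-only policy might contain. In practice you might attach the AWS-managed ReadOnlyAccess policy instead; the CloudTrail log bucket name below is hypothetical.

```python
# Hedged sketch: a read-only policy for the auditor's IAM user, covering
# CloudTrail event history and the bucket holding the CloudTrail logs.
auditor_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Allow",
         "Action": ["cloudtrail:DescribeTrails", "cloudtrail:LookupEvents"],
         "Resource": "*"},
        {"Effect": "Allow",
         "Action": ["s3:GetObject", "s3:ListBucket"],
         "Resource": ["arn:aws:s3:::cloudtrail-logs-bucket",
                      "arn:aws:s3:::cloudtrail-logs-bucket/*"]},
    ],
}
```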

NEW QUESTION 8
Your company has defined a set of S3 buckets in AWS. They need to monitor the S3 buckets and know the source IP address and the person who makes requests to the S3 buckets. How can this be achieved?
Please select:

  • A. Enable VPC flow logs to know the source IP addresses
  • B. Monitor the S3 API calls by using Cloudtrail logging
  • C. Monitor the S3 API calls by using Cloudwatch logging
  • D. Enable AWS Inspector for the S3 bucket

Answer: B

Explanation:
The AWS Documentation mentions the following
Amazon S3 is integrated with AWS CloudTrail. CloudTrail is a service that captures specific API calls made to Amazon S3 from your AWS account and delivers the log files to an Amazon S3 bucket that you specify. It captures API calls made from the Amazon S3 console or from the Amazon S3 API. Using the information collected by CloudTrail, you can determine what request was made to Amazon S3, the source IP address from which the request was made, who made the request, when it was made, and so on.
Options A, C, and D are invalid because these services cannot be used to identify both the caller and the source IP address of requests to S3 buckets.
For more information on CloudTrail logging, please refer to the below link: https://docs.aws.amazon.com/AmazonS3/latest/dev/cloudtrail-logging.html
The correct answer is: Monitor the S3 API calls by using Cloudtrail logging.
Submit your Feedback/Queries to our Experts
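To make the point concrete, here is a sketch of the fields in a CloudTrail record for an S3 API call that answer "who" and "from where". All values are illustrative, not real log data.

```python
# Hedged sketch: the shape of a CloudTrail event record for an S3 request.
sample_record = {
    "eventSource": "s3.amazonaws.com",
    "eventName": "GetObject",
    "eventTime": "2024-01-15T12:00:00Z",                       # when it was made
    "sourceIPAddress": "198.51.100.7",                         # from where
    "userIdentity": {"type": "IAMUser", "userName": "alice"},  # who made it
}
who = sample_record["userIdentity"]["userName"]
source_ip = sample_record["sourceIPAddress"]
```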

NEW QUESTION 9
You are designing a connectivity solution between on-premises infrastructure and Amazon VPC. Your on-premises servers will be communicating with your VPC instances. You will be establishing IPsec tunnels over the internet, using VPN gateways and terminating the IPsec tunnels on AWS-supported customer gateways. Which of the following objectives would you achieve by implementing an IPsec tunnel as outlined above? Choose 4 answers from the options below.
Please select:

  • A. End-to-end protection of data in transit
  • B. End-to-end Identity authentication
  • C. Data encryption across the internet
  • D. Protection of data in transit over the Internet
  • E. Peer identity authentication between VPN gateway and customer gateway
  • F. Data integrity protection across the Internet

Answer: CDEF

Explanation:
An IPsec tunnel between a virtual private gateway and a customer gateway encrypts traffic as it crosses the internet, protects that data in transit, authenticates the two tunnel peers to each other, and protects the integrity of the data across the internet.
Options A and B are invalid because the tunnel terminates at the gateways, not at the communicating hosts, so it cannot provide end-to-end protection of data in transit or end-to-end identity authentication.
For more information on AWS VPN connections, please visit the below link: https://docs.aws.amazon.com/vpc/latest/userguide/vpn-connections.html
The correct answers are: Data encryption across the internet. Protection of data in transit over the Internet. Peer identity authentication between VPN gateway and customer gateway. Data integrity protection across the Internet.
Submit your Feedback/Queries to our Experts

NEW QUESTION 10
Company policy requires that all insecure server protocols, such as FTP, Telnet, HTTP, etc., be disabled on all servers. The security team would like to regularly check all servers to ensure compliance with this requirement, using a scheduled CloudWatch event to trigger a review of the current infrastructure. What process will check compliance of the company's EC2 instances?
Please select:

  • A. Trigger an AWS Config Rules evaluation of the restricted-common-ports rule against every EC2 instance.
  • B. Query the Trusted Advisor API for all best practice security checks and check for "action recommended" status.
  • C. Enable a GuardDuty threat detection analysis targeting the port configuration on every EC2 instance.
  • D. Run an Amazon inspector assessment using the Runtime Behavior Analysis rules package against every EC2 instance.

Answer: D

Explanation:
Option A is incorrect because the restricted-common-ports AWS Config rule checks security group rules, not which services are actually running on the instances.
Option B is incorrect because querying the Trusted Advisor API for these checks is not a way to detect insecure server protocols running on the instances.
Option C is incorrect because GuardDuty is used to detect threats, not to check the compliance of security protocols.
Option D is correct: running an Amazon Inspector assessment with the Runtime Behavior Analysis rules package analyzes the behavior of your instances during an assessment run and provides guidance about how to make your EC2 instances more secure.
Insecure Server Protocols
This rule helps determine whether your EC2 instances allow support for insecure and unencrypted ports/services such as FTP, Telnet, HTTP, IMAP, POP version 3, SMTP, SNMP versions 1 and 2, rsh, and rlogin.
For more information, please refer to the below URL: https://docs.aws.amazon.com/inspector/latest/userguide/inspector_runtime-behavior-analysis.html#insecure-protocols
The correct answer is: Run an Amazon Inspector assessment using the Runtime Behavior Analysis rules package against every EC2 instance.
Submit your Feedback/Queries to our Experts

NEW QUESTION 11
Which of the below services can be integrated with the AWS Web Application Firewall service? Choose 2 answers from the options given below.
Please select:

  • A. AWS Cloudfront
  • B. AWS Lambda
  • C. AWS Application Load Balancer
  • D. AWS Classic Load Balancer

Answer: AC

Explanation:
The AWS documentation mentions the following on the Application Load Balancer:
AWS WAF can be deployed on Amazon CloudFront and the Application Load Balancer (ALB). As part of Amazon CloudFront it can be part of your Content Distribution Network (CDN), protecting your resources and content at the Edge locations, and as part of the Application Load Balancer it can protect your origin web servers running behind the ALBs.
Options B and D are invalid because only Cloudfront and the Application Load Balancer services are supported by AWS WAF.
For more information on the web application firewall, please refer to the below URL: https://aws.amazon.com/waf/faqs/
The correct answers are: AWS Cloudfront. AWS Application Load Balancer.
Submit your Feedback/Queries to our Experts
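As a sketch of the ALB integration, the parameters below associate a web ACL with a load balancer using the current wafv2 API (the exam-era service was WAF Classic, whose API differed). Both ARNs are hypothetical; the real call is `boto3.client("wafv2").associate_web_acl(**assoc_params)`.

```python
# Hedged sketch: associating a WAF web ACL with an Application Load Balancer.
assoc_params = {
    "WebACLArn": ("arn:aws:wafv2:us-east-1:123456789012:"
                  "regional/webacl/app-acl/1a2b3c4d"),
    "ResourceArn": ("arn:aws:elasticloadbalancing:us-east-1:123456789012:"
                    "loadbalancer/app/my-alb/50dc6c495c0c9188"),
}
```

For CloudFront, the web ACL is instead set on the distribution itself when the distribution is created or updated.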

NEW QUESTION 12
A company has a legacy application that outputs all logs to a local text file. Logs from all applications running on AWS must be continually monitored for security-related messages.
What can be done to allow the company to deploy the legacy application on Amazon EC2 and still meet the monitoring requirement?
Please select:

  • A. Create a Lambda function that mounts the EBS volume with the logs and scans the logs for security incidents. Trigger the function every 5 minutes with a scheduled Cloudwatch event.
  • B. Send the local text log files to CloudWatch Logs and configure a CloudWatch metric filter. Trigger CloudWatch alarms based on the metrics.
  • C. Install the Amazon Inspector agent on any EC2 instance running the legacy application. Generate CloudWatch alerts based on any Amazon Inspector findings.
  • D. Export the local text log files to CloudTrail. Create a Lambda function that queries the CloudTrail logs for security incidents using Athena.

Answer: B

Explanation:
One can send the log files to CloudWatch Logs; log files can also be sent from on-premises servers. You can then specify metric filters to search the logs for any specific values, and then create alarms based on these metrics.
Option A is invalid because this would be a long, overdrawn process to achieve this requirement. Option C is invalid because Amazon Inspector cannot be used to monitor for security-related messages. Option D is invalid because log files cannot be exported to AWS CloudTrail.
For more information on the CloudWatch Logs agent, please visit the below URL: https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/QuickStartEC2Instance.html
The correct answer is: Send the local text log files to Cloudwatch Logs and configure a Cloudwatch metric filter. Trigger cloudwatch alarms based on the metrics.
Submit your Feedback/Queries to our Experts
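The metric-filter step above can be sketched as the parameters below. The log group name and filter pattern are hypothetical; the real call is `boto3.client("logs").put_metric_filter(**filter_params)`.

```python
# Hedged sketch: a CloudWatch Logs metric filter that counts security-related
# lines in the legacy application's log group.
filter_params = {
    "logGroupName": "/legacy-app/logs",
    "filterName": "security-messages",
    "filterPattern": "?ERROR ?SECURITY ?DENIED",  # match any of these terms
    "metricTransformations": [{
        "metricName": "SecurityMessageCount",
        "metricNamespace": "LegacyApp",
        "metricValue": "1",  # add 1 to the metric per matching log line
    }],
}
```

A CloudWatch alarm on LegacyApp/SecurityMessageCount then completes the monitoring pipeline.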

NEW QUESTION 13
When managing permissions for the API gateway, what can be used to ensure that the right level of permissions is given to developers, IT admins, and users? These permissions should be easily managed.
Please select:

  • A. Use the Secure Token Service to manage the permissions for the different users
  • B. Use IAM Policies to create different policies for the different types of users.
  • C. Use the AWS Config tool to manage the permissions for the different users
  • D. Use IAM Access Keys to create sets of keys for the different types of users

Answer: B

Explanation:
The AWS Documentation mentions the following:
You control access to Amazon API Gateway with IAM permissions by controlling access to the following two API Gateway component processes:
* To create, deploy, and manage an API in API Gateway, you must grant the API developer permissions to perform the required actions supported by the API management component of API Gateway.
* To call a deployed API or to refresh the API caching, you must grant the API caller permissions to perform required IAM actions supported by the API execution component of API Gateway.
Options A, C, and D are invalid because these cannot be used to control access to AWS services; this needs to be done via policies. For more information on permissions with the API gateway, please visit the following URL: https://docs.aws.amazon.com/apigateway/latest/developerguide/permissions.html
The correct answer is: Use IAM Policies to create different policies for the different types of users.
Submit your Feedback/Queries to our Experts
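The two component processes above map to two different kinds of policy statement, sketched below. The region, account ID, and API ID are hypothetical.

```python
# Hedged sketch: one policy for API developers (manage APIs via the
# apigateway service) and one for API callers (invoke a deployed stage
# via execute-api).
developer_policy = {
    "Version": "2012-10-17",
    "Statement": [{"Effect": "Allow",
                   "Action": ["apigateway:GET", "apigateway:POST",
                              "apigateway:PUT", "apigateway:DELETE"],
                   "Resource": "arn:aws:apigateway:us-east-1::/restapis/*"}],
}
caller_policy = {
    "Version": "2012-10-17",
    "Statement": [{"Effect": "Allow",
                   "Action": "execute-api:Invoke",
                   "Resource": "arn:aws:execute-api:us-east-1:123456789012:abc123/prod/*"}],
}
```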

NEW QUESTION 14
A customer has an instance hosted in the AWS Public Cloud. The VPC and subnet used to host the instance have been created with the default settings for the Network Access Control Lists. They need to provide an IT Administrator secure access to the underlying instance. How can this be accomplished?
Please select:

  • A. Ensure the Network Access Control Lists allow Inbound SSH traffic from the IT Administrator's Workstation
  • B. Ensure the Network Access Control Lists allow Outbound SSH traffic from the IT Administrator's Workstation
  • C. Ensure that the security group allows Inbound SSH traffic from the IT Administrator's Workstation
  • D. Ensure that the security group allows Outbound SSH traffic from the IT Administrator's Workstation

Answer: C

Explanation:
Options A and B are invalid because the default NACL rules already allow all inbound and outbound traffic.
The requirement is that the IT administrator should be able to access this EC2 instance from his workstation. For that we need to configure the security group of the EC2 instance to allow traffic from the IT administrator's workstation. Hence option C is correct.
Option D is incorrect because we need to enable inbound SSH traffic on the EC2 instance's security group, since the traffic originates from the IT admin's workstation.
The correct answer is: Ensure that the security group allows Inbound SSH traffic from the IT Administrator's Workstation.
Submit your Feedback/Queries to our Experts
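The inbound rule can be sketched as the parameters below, restricting SSH to the administrator's workstation IP. The security group ID and IP address are hypothetical; the real call is `boto3.client("ec2").authorize_security_group_ingress(**ingress_params)`.

```python
# Hedged sketch: allow inbound SSH (TCP 22) only from the admin workstation.
ingress_params = {
    "GroupId": "sg-0abc1234567890",
    "IpPermissions": [{
        "IpProtocol": "tcp",
        "FromPort": 22,
        "ToPort": 22,  # SSH
        "IpRanges": [{"CidrIp": "203.0.113.10/32",
                      "Description": "IT admin workstation"}],
    }],
}
```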

NEW QUESTION 15
Your developer is using the KMS service and an assigned key in their Java program. They get the below error when running the code:
arn:aws:iam::113745388712:user/UserB is not authorized to perform: kms:DescribeKey
Which of the following could help resolve the issue?
Please select:

  • A. Ensure that UserB is given the right IAM role to access the key
  • B. Ensure that UserB is given the right permissions in the IAM policy
  • C. Ensure that UserB is given the right permissions in the Key policy
  • D. Ensure that UserB is given the right permissions in the Bucket policy

Answer: C

Explanation:
You need to ensure that UserB is given access via the key policy for the key.
Option A is invalid because you don't assign roles to IAM users in this way; access to a KMS key is controlled primarily through the key policy.
For more information on key policies, please visit the below link: https://docs.aws.amazon.com/kms/latest/developerguide/key-policies.html
The correct answer is: Ensure that UserB is given the right permissions in the Key policy
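A sketch of the key policy statement that would resolve the error. The account ID and user name come from the error message in the question; the statement ID and the extra actions are illustrative, and adding the statement to the key's existing policy is assumed, not shown.

```python
# Hedged sketch: a key policy statement granting UserB the kms:DescribeKey
# permission named in the error (plus typical usage permissions).
key_policy_statement = {
    "Sid": "AllowUserBDescribe",
    "Effect": "Allow",
    "Principal": {"AWS": "arn:aws:iam::113745388712:user/UserB"},
    "Action": ["kms:DescribeKey", "kms:Encrypt", "kms:Decrypt"],
    "Resource": "*",  # in a key policy, "*" means this key
}
```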

NEW QUESTION 16
......

P.S. Easily pass AWS-Certified-Security-Specialty Exam with 485 Q&As Dumps-hub.com Dumps & pdf Version, Welcome to Download the Newest Dumps-hub.com AWS-Certified-Security-Specialty Dumps: https://www.dumps-hub.com/AWS-Certified-Security-Specialty-dumps.html (485 New Questions)

