Top Tips on the Latest SAA-C03 Sample Questions

Want to know the features of the Pass4sure SAA-C03 exam practice test? Want to learn more about the Amazon Web Services AWS Certified Solutions Architect - Associate (SAA-C03) certification experience? Study verified Amazon Web Services SAA-C03 answers to up-to-the-minute SAA-C03 questions at Pass4sure. Get an absolute guarantee to pass the Amazon Web Services SAA-C03 (AWS Certified Solutions Architect - Associate) test on your first attempt.

We also have free SAA-C03 dump questions for you:

NEW QUESTION 1
A company wants to use Amazon S3 for the secondary copy of its dataset. The company would rarely need to access this copy. The storage solution's cost should be minimal.
Which storage solution meets these requirements?

  • A. S3 Standard
  • B. S3 Intelligent-Tiering
  • C. S3 Standard-Infrequent Access (S3 Standard-IA)
  • D. S3 One Zone-Infrequent Access (S3 One Zone-IA)

Answer: C

NEW QUESTION 2
A company stores call transcript files on a monthly basis. Users access the files randomly within 1 year of the call, but users access the files infrequently after 1 year. The company wants to optimize its solution by giving users the ability to query and retrieve files that are less than 1-year-old as quickly as possible. A delay in retrieving older files is acceptable.
Which solution will meet these requirements MOST cost-effectively?

  • A. Store individual files with tags in Amazon S3 Glacier Instant Retrieval. Query the tags to retrieve the files from S3 Glacier Instant Retrieval.
  • B. Store individual files in Amazon S3 Intelligent-Tiering. Use S3 Lifecycle policies to move the files to S3 Glacier Flexible Retrieval after 1 year. Query and retrieve the files that are in Amazon S3 by using Amazon Athena. Query and retrieve the files that are in S3 Glacier by using S3 Glacier Select.
  • C. Store individual files with tags in Amazon S3 Standard storage. Store search metadata for each archive in Amazon S3 Standard storage. Use S3 Lifecycle policies to move the files to S3 Glacier Instant Retrieval after 1 year. Query and retrieve the files by searching for metadata from Amazon S3.
  • D. Store individual files in Amazon S3 Standard storage. Use S3 Lifecycle policies to move the files to S3 Glacier Deep Archive after 1 year. Store search metadata in Amazon RDS. Query the files from Amazon RDS. Retrieve the files from S3 Glacier Deep Archive.

Answer: C
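The transition in the correct answer can be expressed as an S3 Lifecycle configuration. Below is a minimal sketch in Python, using the dict shape that boto3's `put_bucket_lifecycle_configuration` accepts; the bucket name and the `transcripts/` prefix are hypothetical, not part of the question.

```python
# Sketch of the S3 Lifecycle rule behind answer C: keep files in S3 Standard
# for fast access during the first year, then transition them to S3 Glacier
# Instant Retrieval after 365 days.
lifecycle_configuration = {
    "Rules": [
        {
            "ID": "archive-transcripts-after-1-year",
            "Status": "Enabled",
            "Filter": {"Prefix": "transcripts/"},   # hypothetical prefix
            "Transitions": [
                # GLACIER_IR is the lifecycle storage-class name for
                # S3 Glacier Instant Retrieval.
                {"Days": 365, "StorageClass": "GLACIER_IR"}
            ],
        }
    ]
}

# With boto3 this dict could be applied via:
# s3.put_bucket_lifecycle_configuration(
#     Bucket="example-transcripts-bucket",   # hypothetical bucket
#     LifecycleConfiguration=lifecycle_configuration)
```

Search metadata stays in S3 Standard, so queries remain fast while only the aged files move to cheaper storage.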

NEW QUESTION 3
A company uses Amazon S3 as its data lake. The company has a new partner that must use SFTP to upload data files. A solutions architect needs to implement a highly available SFTP solution that minimizes operational overhead.
Which solution will meet these requirements?

  • A. Use AWS Transfer Family to configure an SFTP-enabled server with a publicly accessible endpoint. Choose the S3 data lake as the destination.
  • B. Use Amazon S3 File Gateway as an SFTP server. Expose the S3 File Gateway endpoint URL to the new partner. Share the S3 File Gateway endpoint with the new partner.
  • C. Launch an Amazon EC2 instance in a private subnet in a VPC. Instruct the new partner to upload files to the EC2 instance by using a VPN. Run a cron job script on the EC2 instance to upload files to the S3 data lake.
  • D. Launch Amazon EC2 instances in a private subnet in a VPC. Place a Network Load Balancer (NLB) in front of the EC2 instances. Create an SFTP listener port for the NLB. Share the NLB hostname with the new partner. Run a cron job script on the EC2 instances to upload files to the S3 data lake.

Answer: A
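The Transfer Family setup from answer A can be sketched as two parameter dicts in the shape boto3's `create_server` and `create_user` calls accept. Every concrete value below (role ARN, user name, bucket path) is a hypothetical placeholder, not something stated in the question.

```python
# Sketch of an AWS Transfer Family SFTP server that lands uploads directly
# in the S3 data lake. Transfer Family is a managed, multi-AZ service, which
# is what gives answer A its high availability with low operational overhead.
transfer_server = {
    "Protocols": ["SFTP"],
    "EndpointType": "PUBLIC",                 # publicly accessible endpoint
    "IdentityProviderType": "SERVICE_MANAGED",
}

partner_user = {
    "UserName": "partner-upload",                                      # assumed
    "Role": "arn:aws:iam::123456789012:role/transfer-s3-access",       # assumed
    "HomeDirectory": "/example-data-lake-bucket/partner-uploads",      # assumed
}

# With boto3: server = transfer.create_server(**transfer_server)
# then: transfer.create_user(ServerId=server["ServerId"], **partner_user)
```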

NEW QUESTION 4
A company has a data ingestion workflow that consists of the following:

  • An Amazon Simple Notification Service (Amazon SNS) topic for notifications about new data deliveries
  • An AWS Lambda function to process the data and record metadata

The company observes that the ingestion workflow fails occasionally because of network connectivity issues. When such a failure occurs, the Lambda function does not ingest the corresponding data unless the company manually reruns the job.
Which combination of actions should a solutions architect take to ensure that the Lambda function ingests all data in the future? (Select TWO.)

  • A. Configure the Lambda function in multiple Availability Zones.
  • B. Create an Amazon Simple Queue Service (Amazon SQS) queue, and subscribe it to the SNS topic.
  • C. Increase the CPU and memory that are allocated to the Lambda function.
  • D. Increase provisioned throughput for the Lambda function.
  • E. Modify the Lambda function to read from an Amazon Simple Queue Service (Amazon SQS) queue.

Answer: BE
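The SQS-backed pattern from answers B and E can be sketched as a Lambda handler that reports per-message failures, so SQS redelivers only the failed messages instead of losing them. This is a minimal sketch: the message format and ingestion logic are hypothetical, and it assumes the SQS event source mapping has `ReportBatchItemFailures` enabled.

```python
import json

def process_record(body):
    # Placeholder for the real ingestion logic (hypothetical message format).
    data = json.loads(body)
    if "payload" not in data:
        raise ValueError("missing payload")
    return data["payload"]

def handler(event, context):
    # SQS-triggered Lambda. Returning the IDs of failed messages (the
    # partial-batch-failure response shape) makes SQS redeliver only those
    # messages, so a transient network error no longer drops data.
    failures = []
    for record in event["Records"]:
        try:
            process_record(record["body"])
        except Exception:
            failures.append({"itemIdentifier": record["messageId"]})
    return {"batchItemFailures": failures}
```

Because the queue durably buffers SNS notifications, a failed invocation leaves the message on the queue until it succeeds or moves to a dead-letter queue.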

NEW QUESTION 5
A company has on-premises servers that run a relational database. The database serves high-read traffic for users in different locations. The company wants to migrate the database to AWS with the least amount of effort. The database solution must support high availability and must not affect the company's current traffic flow.
Which solution meets these requirements?

  • A. Use a database in Amazon RDS with Multi-AZ and at least one read replica.
  • B. Use a database in Amazon RDS with Multi-AZ and at least one standby replica.
  • C. Use databases that are hosted on multiple Amazon EC2 instances in different AWS Regions.
  • D. Use databases that are hosted on Amazon EC2 instances behind an Application Load Balancer in different Availability Zones

Answer: A

Explanation:
https://aws.amazon.com/blogs/database/implementing-a-disaster-recovery-strategy-with-amazon-rds/

NEW QUESTION 6
A research company runs experiments that are powered by a simulation application and a visualization application. The simulation application runs on Linux and outputs intermediate data to an NFS share every 5 minutes. The visualization application is a Windows desktop application that displays the simulation output and requires an SMB file system.
The company maintains two synchronized file systems. This strategy is causing data duplication and inefficient resource usage. The company needs to migrate the applications to AWS without making code changes to either application.
Which solution will meet these requirements?

  • A. Migrate both applications to AWS Lambda. Create an Amazon S3 bucket to exchange data between the applications.
  • B. Migrate both applications to Amazon Elastic Container Service (Amazon ECS). Configure Amazon FSx File Gateway for storage.
  • C. Migrate the simulation application to Linux Amazon EC2 instances. Migrate the visualization application to Windows EC2 instances. Configure Amazon Simple Queue Service (Amazon SQS) to exchange data between the applications.
  • D. Migrate the simulation application to Linux Amazon EC2 instances. Migrate the visualization application to Windows EC2 instances. Configure Amazon FSx for NetApp ONTAP for storage.

Answer: D

Explanation:
Amazon FSx for NetApp ONTAP supports multi-protocol access, so the same file system can be mounted over NFS by the Linux simulation application and over SMB by the Windows visualization application, eliminating the duplicated file systems without code changes.

NEW QUESTION 7
A solutions architect is designing a customer-facing application for a company. The application's database will have a clearly defined access pattern throughout the year and will have a variable number of reads and writes that depend on the time of year. The company must retain audit records for the database for 7 days. The recovery point objective (RPO) must be less than 5 hours. Which solution meets these requirements?

  • A. Use Amazon DynamoDB with auto scaling. Use on-demand backups and Amazon DynamoDB Streams.
  • B. Use Amazon Redshift. Configure concurrency scaling. Activate audit logging. Perform database snapshots every 4 hours.
  • C. Use Amazon RDS with Provisioned IOPS. Activate the database auditing parameter. Perform database snapshots every 5 hours.
  • D. Use Amazon Aurora MySQL with auto scaling. Activate the database auditing parameter.

Answer: A

NEW QUESTION 8
A hospital recently deployed a RESTful API with Amazon API Gateway and AWS Lambda. The hospital uses API Gateway and Lambda to upload reports that are in PDF format and JPEG format. The hospital needs to modify the Lambda code to identify protected health information (PHI) in the reports.
Which solution will meet these requirements with the LEAST operational overhead?

  • A. Use existing Python libraries to extract the text from the reports and to identify the PHI from the extracted text.
  • B. Use Amazon Textract to extract the text from the reports Use Amazon SageMaker to identify the PHI from the extracted text.
  • C. Use Amazon Textract to extract the text from the reports Use Amazon Comprehend Medical to identify the PHI from the extracted text
  • D. Use Amazon Rekognition to extract the text from the reports Use Amazon Comprehend Medical to identify the PHI from the extracted text

Answer: C

NEW QUESTION 9
A global company hosts its web application on Amazon EC2 instances behind an Application Load Balancer (ALB). The web application has static data and dynamic data. The company stores its static data in an Amazon S3 bucket. The company wants to improve performance and reduce latency for the static data and dynamic data. The company is using its own domain name registered with Amazon Route 53.
What should a solutions architect do to meet these requirements?

  • A. Create an Amazon CloudFront distribution that has the S3 bucket and the ALB as origins. Configure Route 53 to route traffic to the CloudFront distribution.
  • B. Create an Amazon CloudFront distribution that has the ALB as an origin. Create an AWS Global Accelerator standard accelerator that has the S3 bucket as an endpoint. Configure Route 53 to route traffic to the CloudFront distribution.
  • C. Create an Amazon CloudFront distribution that has the S3 bucket as an origin. Create an AWS Global Accelerator standard accelerator that has the ALB and the CloudFront distribution as endpoints. Create a custom domain name that points to the accelerator DNS name. Use the custom domain name as an endpoint for the web application.
  • D. Create an Amazon CloudFront distribution that has the ALB as an origin. Create an AWS Global Accelerator standard accelerator that has the S3 bucket as an endpoint. Create two domain names. Point one domain name to the CloudFront DNS name for dynamic content. Point the other domain name to the accelerator DNS name for static content. Use the domain names as endpoints for the web application.

Answer: A

Explanation:
A single CloudFront distribution can serve both content types: the S3 bucket as the origin for static data and the ALB as the origin for dynamic data, with Route 53 aliasing the domain to the distribution. Global Accelerator does not support S3 buckets or CloudFront distributions as endpoints.

NEW QUESTION 10
A company that primarily runs its application servers on premises has decided to migrate to AWS. The company wants to minimize its need to scale its Internet Small Computer Systems Interface (iSCSI) storage on premises. The company wants only its recently accessed data to remain stored locally.
Which AWS solution should the company use to meet these requirements?

  • A. Amazon S3 File Gateway
  • B. AWS Storage Gateway Tape Gateway
  • C. AWS Storage Gateway Volume Gateway stored volumes
  • D. AWS Storage Gateway Volume Gateway cached volumes

Answer: D

NEW QUESTION 11
A company has a web-based map application that provides status information about ongoing repairs. The application sometimes has millions of users. Repair teams have a mobile app that sends current location and status in a JSON message to a REST-based endpoint.
Few repairs occur on most days. The company wants the application to be highly available and to scale when large numbers of repairs occur after natural disasters. Customers use the application most often during these times. The company does not want to pay for idle capacity.

  • A. Create a webpage that is based on Amazon S3 to display information. Use Amazon API Gateway and AWS Lambda to receive the JSON status data. Store the JSON data in Amazon S3.
  • B. Use Amazon EC2 instances as web servers across multiple Availability Zones. Run the EC2 instances in an Auto Scaling group. Use Amazon API Gateway and AWS Lambda to receive the JSON status data. Store the JSON data in Amazon S3.
  • C. Use Amazon EC2 instances as web servers across multiple Availability Zones. Run the EC2 instances in an Auto Scaling group. Use a REST endpoint on the EC2 instances to receive the JSON status data. Store the JSON data in an Amazon RDS Multi-AZ DB instance.
  • D. Use Amazon EC2 instances as web servers across multiple Availability Zones. Run the EC2 instances in an Auto Scaling group. Use a REST endpoint on the EC2 instances to receive the JSON status data. Store the JSON data in an Amazon DynamoDB table.

Answer: A

NEW QUESTION 12
A company has an application that provides marketing services to stores. The services are based on previous purchases by store customers. The stores upload transaction data to the company through SFTP, and the data is processed and analyzed to generate new marketing offers. Some of the files can exceed 200 GB in size.
Recently, the company discovered that some of the stores have uploaded files that contain personally identifiable information (PII) that should not have been included. The company wants administrators to be alerted if PII is shared again.
The company also wants to automate remediation.
What should a solutions architect do to meet these requirements with the LEAST development effort?

  • A. Use an Amazon S3 bucket as a secure transfer point. Use Amazon Inspector to scan the objects in the bucket. If objects contain PII, trigger an S3 Lifecycle policy to remove the objects that contain PII.
  • B. Use an Amazon S3 bucket as a secure transfer point. Use Amazon Macie to scan the objects in the bucket. If objects contain PII, use Amazon Simple Notification Service (Amazon SNS) to trigger a notification to the administrators to remove the objects that contain PII.
  • C. Implement custom scanning algorithms in an AWS Lambda function. Trigger the function when objects are loaded into the bucket. If objects contain PII, use Amazon Simple Notification Service (Amazon SNS) to trigger a notification to the administrators to remove the objects that contain PII.
  • D. Implement custom scanning algorithms in an AWS Lambda function. Trigger the function when objects are loaded into the bucket. If objects contain PII, use Amazon Simple Email Service (Amazon SES) to trigger a notification to the administrators and trigger an S3 Lifecycle policy to remove the objects that contain PII.

Answer: B
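For the automated-remediation part, Macie findings can be routed through Amazon EventBridge to a Lambda function. A minimal sketch of such a handler is below; the event shape is a simplified subset of a real Macie finding, and the remediation call itself is left as a comment because the question only requires alerting plus automation, not a specific action.

```python
def object_from_macie_finding(event):
    # A Macie finding delivered via EventBridge carries the affected object
    # under detail.resourcesAffected. Extract the bucket and key so a
    # remediation step (e.g., quarantine or delete) can target the object.
    resources = event["detail"]["resourcesAffected"]
    bucket = resources["s3Bucket"]["name"]
    key = resources["s3Object"]["key"]
    return bucket, key

def handler(event, context):
    bucket, key = object_from_macie_finding(event)
    # Hypothetical remediation, e.g. with boto3:
    # s3.delete_object(Bucket=bucket, Key=key)
    return {"bucket": bucket, "key": key}
```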

NEW QUESTION 13
A company needs guaranteed Amazon EC2 capacity in three specific Availability Zones in a specific AWS Region for an upcoming event that will last 1 week.
What should the company do to guarantee the EC2 capacity?

  • A. Purchase Reserved Instances that specify the Region needed.
  • B. Create an On-Demand Capacity Reservation that specifies the Region needed.
  • C. Purchase Reserved Instances that specify the Region and three Availability Zones needed.
  • D. Create an On-Demand Capacity Reservation that specifies the Region and three Availability Zones needed.

Answer: D

Explanation:
https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ec2-capacity-reservations.html: "When you create a Capacity Reservation, you specify:
The Availability Zone in which to reserve the capacity"
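Since a Capacity Reservation always targets a single Availability Zone, the company would create one reservation per zone. Below is a sketch of the three parameter sets; the instance type, count, and zone names are illustrative assumptions, not values from the question.

```python
# Sketch of the per-AZ On-Demand Capacity Reservations for the 1-week event.
availability_zones = ["us-east-1a", "us-east-1b", "us-east-1c"]  # assumed

reservations = [
    {
        "InstanceType": "m5.large",        # assumed
        "InstancePlatform": "Linux/UNIX",  # assumed
        "AvailabilityZone": az,
        "InstanceCount": 10,               # assumed
        "EndDateType": "limited",          # expire automatically after the event
    }
    for az in availability_zones
]

# Each dict could be passed to boto3's
# ec2.create_capacity_reservation(**params, EndDate=<event end>) call.
```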

NEW QUESTION 14
A company runs multiple Windows workloads on AWS. The company's employees use Windows file shares that are hosted on two Amazon EC2 instances. The file shares synchronize data between themselves and maintain duplicate copies. The company wants a highly available and durable storage solution that preserves how users currently access the files.

  • A. Migrate all the data to Amazon S3. Set up IAM authentication for users to access files.
  • B. Set up an Amazon S3 File Gateway. Mount the S3 File Gateway on the existing EC2 instances.
  • C. Extend the file share environment to Amazon FSx for Windows File Server with a Multi-AZ configuration. Migrate all the data to FSx for Windows File Server.
  • D. Extend the file share environment to Amazon Elastic File System (Amazon EFS) with a Multi-AZ configuration. Migrate all the data to Amazon EFS.

Answer: C

NEW QUESTION 15
A company has two applications: a sender application that sends messages with payloads to be processed and a processing application intended to receive the messages with payloads. The company wants to implement an AWS service to handle messages between the two applications. The sender application can send about 1,000 messages each hour. The messages may take up to 2 days to be processed. If the messages fail to process, they must be retained so that they do not impact the processing of any remaining messages.
Which solution meets these requirements and is the MOST operationally efficient?

  • A. Set up an Amazon EC2 instance running a Redis database. Configure both applications to use the instance. Store, process, and delete the messages, respectively.
  • B. Use an Amazon Kinesis data stream to receive the messages from the sender application. Integrate the processing application with the Kinesis Client Library (KCL).
  • C. Integrate the sender and processor applications with an Amazon Simple Queue Service (Amazon SQS) queue. Configure a dead-letter queue to collect the messages that failed to process.
  • D. Subscribe the processing application to an Amazon Simple Notification Service (Amazon SNS) topic to receive notifications to process. Integrate the sender application to write to the SNS topic.

Answer: C

Explanation:
https://aws.amazon.com/blogs/compute/building-loosely-coupled-scalable-c-applications-with-amazon-sqs-and-
https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-dead-letter-queues.htm
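The queue configuration behind answer C can be sketched as the attributes passed when creating the main SQS queue. The queue names, ARN, retention period, and `maxReceiveCount` below are illustrative assumptions.

```python
import json

# Sketch of the main SQS queue with a dead-letter queue (DLQ): after a few
# failed receive attempts, a message moves to the DLQ so it no longer blocks
# the remaining messages, but is still retained for inspection.
dead_letter_queue_arn = "arn:aws:sqs:us-east-1:123456789012:payloads-dlq"  # assumed

main_queue_attributes = {
    # Keep messages up to 4 days; processing may take up to 2 days.
    "MessageRetentionPeriod": str(4 * 24 * 3600),
    "RedrivePolicy": json.dumps({
        "deadLetterTargetArn": dead_letter_queue_arn,
        "maxReceiveCount": 3,   # assumed threshold before moving to the DLQ
    }),
}

# With boto3:
# sqs.create_queue(QueueName="payloads", Attributes=main_queue_attributes)
```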

NEW QUESTION 16
A company has a stateless asynchronous application that runs in an Apache Hadoop cluster. The application is invoked on demand to run extract, transform, and load (ETL) jobs several times a day.
A solutions architect needs to migrate this application to the AWS Cloud by designing an Amazon EMR cluster for the workload. The cluster must be available immediately to process jobs.
Which implementation meets these requirements MOST cost-effectively?

  • A. Use zonal Reserved Instances for the master nodes and the core nodes. Use a Spot Fleet for the task nodes.
  • B. Use zonal Reserved Instances for the master nodes. Use Spot Instances for the core nodes and the task nodes.
  • C. Use regional Reserved Instances for the master nodes. Use a Spot Fleet for the core nodes and the task nodes.
  • D. Use regional Reserved Instances for the master nodes. Use On-Demand Capacity Reservations for the core nodes and the task nodes.

Answer: A

NEW QUESTION 17
A company provides a Voice over Internet Protocol (VoIP) service that uses UDP connections. The service consists of Amazon EC2 instances that run in an Auto Scaling group. The company has deployments across multiple AWS Regions.
The company needs to route users to the Region with the lowest latency. The company also needs automated failover between Regions.
Which solution will meet these requirements?

  • A. Deploy a Network Load Balancer (NLB) and an associated target group. Associate the target group with the Auto Scaling group. Use the NLB as an AWS Global Accelerator endpoint in each Region.
  • B. Deploy an Application Load Balancer (ALB) and an associated target group. Associate the target group with the Auto Scaling group. Use the ALB as an AWS Global Accelerator endpoint in each Region.
  • C. Deploy a Network Load Balancer (NLB) and an associated target group. Associate the target group with the Auto Scaling group. Create an Amazon Route 53 latency record that points to aliases for each NLB. Create an Amazon CloudFront distribution that uses the latency record as an origin.
  • D. Deploy an Application Load Balancer (ALB) and an associated target group. Associate the target group with the Auto Scaling group. Create an Amazon Route 53 weighted record that points to aliases for each ALB. Deploy an Amazon CloudFront distribution that uses the weighted record as an origin.

Answer: A

Explanation:
The service uses UDP, which rules out the ALB and CloudFront (both are HTTP/HTTPS only). A Network Load Balancer supports UDP, and AWS Global Accelerator provides latency-based routing to the closest healthy Regional endpoint with automated cross-Region failover.

NEW QUESTION 18
The management account has an Amazon S3 bucket that contains project reports. The company wants to limit access to this S3 bucket to only users of accounts within the organization in AWS Organizations.
Which solution meets these requirements with the LEAST amount of operational overhead?

  • A. Add the aws:PrincipalOrgID global condition key with a reference to the organization ID to the S3 bucket policy.
  • B. Create an organizational unit (OU) for each department. Add the aws:PrincipalOrgPaths global condition key to the S3 bucket policy.
  • C. Use AWS CloudTrail to monitor the CreateAccount, InviteAccountToOrganization, LeaveOrganization, and RemoveAccountFromOrganization events. Update the S3 bucket policy accordingly.
  • D. Tag each user that needs access to the S3 bucket. Add the aws:PrincipalTag global condition key to the S3 bucket policy.

Answer: A

Explanation:
https://aws.amazon.com/blogs/security/control-access-to-aws-resources-by-using-the-aws-organization-of-iam-principals/
The aws:PrincipalOrgID global condition key provides an alternative to listing all the account IDs for all AWS accounts in an organization. For example, the following Amazon S3 bucket policy allows members of any account in the XXX organization to add an object into the examtopics bucket.
{
  "Version": "2012-10-17",
  "Statement": {
    "Sid": "AllowPutObject",
    "Effect": "Allow",
    "Principal": "*",
    "Action": "s3:PutObject",
    "Resource": "arn:aws:s3:::examtopics/*",
    "Condition": {
      "StringEquals": {
        "aws:PrincipalOrgID": ["XXX"]
      }
    }
  }
}
https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html

NEW QUESTION 19
A company is planning to deploy a newly built application on AWS in a default VPC. The application will consist of a web layer and a database layer. The web server was created in a public subnet, and the MySQL database was created in a private subnet. All subnets are created with the default network ACL settings, and the default security group in the VPC will be replaced with new custom security groups.

  • A. Create a database server security group with inbound and outbound rules for MySQL port 3306 traffic to and from anywhere (0.0.0.0/0).
  • B. Create a database server security group with an inbound rule for MySQL port 3306 and specify the source as a web server security group.
  • C. Create a web server security group with an inbound allow rule for HTTPS port 443 traffic from anywhere (0.0.0.0/0) and an inbound deny rule for IP range 182.20.0.0/16.
  • D. Create a web server security group with an inbound rule for HTTPS port 443 traffic from anywhere (0.0.0.0/0). Create network ACL inbound and outbound deny rules for IP range 182.20.0.0/16.
  • E. Create a web server security group with inbound and outbound rules for HTTPS port 443 traffic to and from anywhere (0.0.0.0/0). Create a network ACL inbound deny rule for IP range 182.20.0.0/16.

Answer: BD
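The two chosen rules can be sketched as the ingress-rule dicts that boto3's `authorize_security_group_ingress` accepts. The security group ID below is a hypothetical placeholder.

```python
# Sketch of the rules from answers B and D. Security groups are allow-only,
# so the 182.20.0.0/16 block must be a network ACL deny rule, not a
# security-group rule (which is why answers C and E are wrong).
web_server_sg_id = "sg-0aaa1111bbbb2222c"   # assumed web-tier group ID

db_ingress_rule = {
    "IpProtocol": "tcp",
    "FromPort": 3306,            # MySQL
    "ToPort": 3306,
    # Only instances in the web-tier security group may reach the database.
    "UserIdGroupPairs": [{"GroupId": web_server_sg_id}],
}

web_ingress_rule = {
    "IpProtocol": "tcp",
    "FromPort": 443,             # HTTPS
    "ToPort": 443,
    "IpRanges": [{"CidrIp": "0.0.0.0/0"}],
}
```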

NEW QUESTION 20
......

Recommend!! Get the full SAA-C03 dumps in VCE and PDF from DumpSolutions.com. Welcome to download: https://www.dumpsolutions.com/SAA-C03-dumps/ (New 0 Q&As Version)