AWS-Certified-Big-Data-Specialty Guide

Avant-garde AWS-Certified-Big-Data-Specialty Bundle 2021

Proper study guides for the up-to-date Amazon AWS Certified Big Data - Specialty exam begin with Amazon AWS-Certified-Big-Data-Specialty preparation products, which are designed to deliver the certified AWS-Certified-Big-Data-Specialty questions and help you pass the AWS-Certified-Big-Data-Specialty test on your first attempt. Try the free AWS-Certified-Big-Data-Specialty demo right now.

Free AWS-Certified-Big-Data-Specialty Demo Online For Amazon Certification:

NEW QUESTION 1
What is one key difference between an Amazon EBS-backed and an instance-store backed instance?

  • A. Amazon EBS-backed instances can be stopped and restarted
  • B. Instance-store backed instances can be stopped and restarted
  • C. Auto Scaling requires using Amazon EBS-backed instances
  • D. Virtual Private Cloud requires EBS backed instances

Answer: A

NEW QUESTION 2
A customer needs to capture all client connection information from their load balancer every five minutes. The company wants to use the data for analyzing traffic patterns and troubleshooting their applications. Which of the following options meets the customer's requirements?

  • A. Enable access logs on the load balancer
  • B. Enable AWS CloudTrail for the load balancer
  • C. Enable Amazon CloudWatch metrics on the load balancer
  • D. Install the Amazon CloudWatch Logs agent on the load balancer

Answer: B
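For reference, the mechanism named in option A, Classic Load Balancer access logs, is enabled through a load balancer attribute whose emit interval can be set to five minutes. A minimal boto3 sketch, with a hypothetical load balancer name and log bucket:

    import boto3

    elb = boto3.client("elb")

    # Publish access logs every 5 minutes to an S3 bucket (names are hypothetical).
    elb.modify_load_balancer_attributes(
        LoadBalancerName="web-elb",
        LoadBalancerAttributes={
            "AccessLog": {
                "Enabled": True,
                "S3BucketName": "elb-access-logs-bucket",
                "S3BucketPrefix": "web-elb",
                "EmitInterval": 5,  # minutes; valid values are 5 and 60
            }
        },
    )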

NEW QUESTION 3
A user is running one instance for only 3 hours every day. The user wants to reduce the cost of the instance. Which of the below-mentioned Reserved Instance categories is advised in this case?

  • A. The user should not use RI; instead only go with the on-demand pricing
  • B. The user should use the AWS high utilized RI
  • C. The user should use the AWS medium utilized RI
  • D. The user should use the AWS low utilized RI

Answer: A

NEW QUESTION 4
A company operates an international business served from a single AWS Region. The company wants to expand into a new country. The regulator for that country requires the Data Architect to maintain a log of financial transactions in the country within 24 hours of a production transaction. The production application is latency insensitive. The new country contains another AWS Region.
What is the most cost-effective way to meet this requirement?

  • A. Use CloudFormation to replicate the production application to the new region
  • B. Use Amazon CloudFront to serve application content locally in the country; Amazon CloudFront logs will satisfy the requirement
  • C. Continue to serve customers from the existing region while using Amazon Kinesis to stream transaction data to the regulator
  • D. Use Amazon S3 cross-region replication to copy and persist production transaction logs to a bucket in the new country's Region

Answer: D
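A minimal boto3 sketch of the cross-region replication in answer D, assuming hypothetical bucket names and an existing replication IAM role; versioning must already be enabled on both buckets:

    import boto3

    s3 = boto3.client("s3")

    # Replicate production transaction logs to a bucket in the new country's Region.
    # Both buckets must have versioning enabled before this call succeeds.
    s3.put_bucket_replication(
        Bucket="prod-transaction-logs",
        ReplicationConfiguration={
            "Role": "arn:aws:iam::123456789012:role/s3-replication-role",
            "Rules": [
                {
                    "ID": "replicate-transaction-logs",
                    "Prefix": "transactions/",
                    "Status": "Enabled",
                    "Destination": {"Bucket": "arn:aws:s3:::regulator-logs-new-region"},
                }
            ],
        },
    )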

NEW QUESTION 5
A customer has a machine learning workflow that consists of multiple quick cycles of reads-writes-reads on Amazon S3. The customer needs to run the workflow on EMR but is concerned that the reads in subsequent cycles will miss new data critical to the machine learning from the prior cycles.
How should the customer accomplish this?

  • A. Turn on EMRFS consistent view when configuring the EMR cluster
  • B. Use AWS Data Pipeline to orchestrate the data processing cycles
  • C. Set Hadoop.data.consistency = true in the core-site.xml file
  • D. Set Hadoop.s3.consistency = true in the core-site.xml file

Answer: B
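For reference, EMRFS consistent view, the mechanism named in option A, is enabled with the emrfs-site classification when the cluster is created. A minimal boto3 sketch with illustrative cluster settings:

    import boto3

    emr = boto3.client("emr")

    # Launch an EMR cluster with EMRFS consistent view turned on
    # (instance types, counts, and roles are illustrative).
    emr.run_job_flow(
        Name="ml-workflow",
        ReleaseLabel="emr-5.30.0",
        Instances={
            "MasterInstanceType": "m5.xlarge",
            "SlaveInstanceType": "m5.xlarge",
            "InstanceCount": 3,
            "KeepJobFlowAliveWhenNoSteps": False,
        },
        Configurations=[
            {"Classification": "emrfs-site", "Properties": {"fs.s3.consistent": "true"}}
        ],
        JobFlowRole="EMR_EC2_DefaultRole",
        ServiceRole="EMR_DefaultRole",
    )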

NEW QUESTION 6
You need to configure an Amazon S3 bucket to serve static assets for your public-facing web application. Which methods ensure that all objects uploaded to the bucket are set to public read? Choose 2 answers

  • A. Set permissions on the object to public read during upload
  • B. Configure the bucket ACL to set all objects to public read
  • C. Configure the bucket policy to set all objects to public read
  • D. Use AWS Identity and Access Management roles to set the bucket to public read
  • E. Amazon S3 objects default to public read, so no action is needed

Answer: BC
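A minimal boto3 sketch of option C, a bucket policy that sets all objects to public read; the bucket name is hypothetical:

    import boto3
    import json

    s3 = boto3.client("s3")

    # Grant anonymous read access to every object in the bucket.
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "PublicReadGetObject",
                "Effect": "Allow",
                "Principal": "*",
                "Action": "s3:GetObject",
                "Resource": "arn:aws:s3:::static-assets-bucket/*",
            }
        ],
    }
    s3.put_bucket_policy(Bucket="static-assets-bucket", Policy=json.dumps(policy))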

NEW QUESTION 7
A company needs a churn prevention model to predict which customers will NOT renew their yearly subscription to the company's service. The company plans to provide these customers with a promotional offer. A binary classification model that uses Amazon Machine Learning is required. On which basis should this binary classification model be built?

  • A. User profiles (age, gender, income, occupation)
  • B. Last user session
  • C. Each user's time series of events in the past 3 months
  • D. Quarterly results

Answer: C

NEW QUESTION 8
A company uses Amazon Redshift for its enterprise data warehouse. A new on-premises PostgreSQL OLTP DB must be integrated into the data warehouse. Each table in the PostgreSQL DB has an indexed last_modified timestamp column. The data warehouse has a staging layer to load source data into the data warehouse environment for further processing.
The data lag between the source PostgreSQL DB and the Amazon Redshift staging layer should NOT exceed four hours.
What is the most efficient technique to meet these requirements?

  • A. Create a DBLINK on the source DB to connect to Amazon Redshift. Use a PostgreSQL trigger on the source table to capture the new insert/update/delete event and execute the event on the Amazon Redshift staging table.
  • B. Use a PostgreSQL trigger on the source table to capture the new insert/update/delete event and write it to an Amazon Kinesis stream. Use a KCL application to execute the event on the Amazon Redshift staging table.
  • C. Extract the incremental changes periodically using a SQL query. Upload the changes to multiple Amazon Simple Storage Service (S3) objects and run the COPY command to load the Amazon Redshift staging table.
  • D. Extract the incremental changes periodically using a SQL query. Upload the changes to a single Amazon Simple Storage Service (S3) object and run the COPY command to load the Amazon Redshift staging layer.

Answer: C
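A sketch of the load half of answer C, assuming the incremental changes have already been extracted and uploaded as multiple objects under one S3 prefix; COPY then loads every object under the prefix in parallel. The connection details, table, and IAM role are hypothetical, and psycopg2 is just one client capable of issuing the statement:

    import psycopg2

    # Connect to the Redshift cluster (endpoint and credentials are placeholders).
    conn = psycopg2.connect(
        host="redshift-cluster.example.us-east-1.redshift.amazonaws.com",
        port=5439, dbname="dw", user="etl_user", password="change-me",
    )
    # COPY pulls all objects under the prefix in parallel into the staging table.
    with conn, conn.cursor() as cur:
        cur.execute("""
            COPY staging.orders_increment
            FROM 's3://etl-bucket/increments/2021-05-01T12/'
            IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
            FORMAT AS CSV
            TIMEFORMAT 'auto';
        """)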

NEW QUESTION 9
A solutions architect works for a company that has a data lake based on a central Amazon S3 bucket.
The data contains sensitive information. The architect must be able to specify exactly which files each user can access. Users access the platform through a SAML federation single sign-on platform.
The architect needs to build a solution that allows fine-grained access control, traceability of access to the objects, and usage of standard tools (AWS Console, AWS CLI) to access the data.
Which solution should the architect build?

  • A. Use Amazon S3 Server-Side Encryption with AWS KMS-Managed Keys for storing data. Use AWS KMS to allow access to specific elements of the platform. Use AWS CloudTrail for auditing.
  • B. Use Amazon S3 Server-Side Encryption with Amazon S3-Managed Keys. Set Amazon S3 ACLs to allow access to specific elements of the platform. Use Amazon S3 access logs for auditing.
  • C. Use Amazon S3 Client-Side Encryption with a Client-Side Master Key. Set Amazon S3 ACLs to allow access to specific elements of the platform. Use Amazon S3 access logs for auditing.
  • D. Use Amazon S3 Client-Side Encryption with AWS KMS-Managed Keys for storing data. Use AWS KMS to allow access to specific elements of the platform. Use AWS CloudTrail for auditing.

Answer: B
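For reference, the auditing piece of answer B, S3 server access logging, is enabled per bucket. A minimal boto3 sketch with hypothetical bucket names; the target bucket must already grant the log-delivery service permission to write:

    import boto3

    s3 = boto3.client("s3")

    # Record every access to the data-lake bucket in a separate log bucket.
    s3.put_bucket_logging(
        Bucket="data-lake-bucket",
        BucketLoggingStatus={
            "LoggingEnabled": {
                "TargetBucket": "data-lake-access-logs",
                "TargetPrefix": "data-lake-bucket/",
            }
        },
    )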

NEW QUESTION 10
A company with a support organization needs support engineers to be able to search historic cases to provide fast responses to new issues raised. The company has forwarded all support messages into an Amazon Kinesis Stream. This meets a company objective of using only managed services to reduce operational overhead.
The company needs an appropriate architecture that allows support engineers to search historic cases and find similar issues and their associated responses.
Which AWS Lambda action is most appropriate?

  • A. Ingest and index the content into an Amazon Elasticsearch domain
  • B. Stem and tokenize the input and store the results into Amazon ElastiCache
  • C. Write data as JSON into Amazon DynamoDB with primary and secondary indexes
  • D. Aggregate feedback in Amazon S3 using a columnar format with partitioning

Answer: A
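A minimal sketch of answer A as a Lambda handler: it decodes the Kinesis records and indexes each message into an Amazon Elasticsearch Service domain. The endpoint and index name are hypothetical, and a production function would sign its requests with SigV4 rather than rely on an open domain access policy:

    import base64
    import json
    import urllib.request

    ES_ENDPOINT = "https://search-support-cases.us-east-1.es.amazonaws.com"  # hypothetical

    def handler(event, context):
        # Each Kinesis record carries a base64-encoded support message.
        for record in event["Records"]:
            message = json.loads(base64.b64decode(record["kinesis"]["data"]))
            req = urllib.request.Request(
                ES_ENDPOINT + "/cases/_doc",
                data=json.dumps(message).encode("utf-8"),
                headers={"Content-Type": "application/json"},
                method="POST",
            )
            urllib.request.urlopen(req)  # assumes the domain policy allows this caller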

NEW QUESTION 11
A company is building a new application in AWS. The architect needs to design a system to collect application log events. The design should be a repeatable pattern that minimizes data loss if an application instance fails, and keeps a durable copy of all log data for at least 30 days.
What is the simplest architecture that will allow the architect to analyze the logs?

  • A. Write them directly to a Kinesis Firehose delivery stream. Configure Kinesis Firehose to load the events into an Amazon Redshift cluster for analysis.
  • B. Write them to a file on Amazon Simple Storage Service (S3). Write an AWS Lambda function that runs in response to the S3 events to load the events into Amazon Elasticsearch Service for analysis.
  • C. Write them to the local disk and configure the Amazon CloudWatch Logs agent to load the data into CloudWatch Logs and subsequently into Amazon Elasticsearch Service.
  • D. Write them to CloudWatch Logs and use an AWS Lambda function to load them into HDFS on an Amazon Elastic MapReduce (EMR) cluster for analysis.

Answer: A
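A minimal sketch of the producer side of answer A, writing application log events directly to a Kinesis Firehose delivery stream. The stream name is hypothetical; the delivery stream itself would be configured separately to land the events in Amazon Redshift:

    import json
    import boto3

    firehose = boto3.client("firehose")

    def emit_log_event(event: dict) -> None:
        # Firehose buffers records and delivers them to the configured destination.
        firehose.put_record(
            DeliveryStreamName="app-log-events",
            Record={"Data": (json.dumps(event) + "\n").encode("utf-8")},
        )

    emit_log_event({"level": "ERROR", "message": "payment timeout", "instance": "i-0abc"})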

NEW QUESTION 12
An administrator needs to manage a large catalog of items from various external sellers. The administrator needs to determine whether the items should be identified as minimally dangerous, dangerous, or highly dangerous based on their textual descriptions. The administrator already has some items with the danger attribute, but receives hundreds of new item descriptions every day without such classification.
The administrator has a system that captures dangerous goods reports from the customer support team or from user feedback. What is a cost-effective architecture to solve this issue?

  • A. Build a set of regular expression rules that are based on the existing examples, and run them on the DynamoDB streams as every new item description is added to the system.
  • B. Build a Kinesis Streams process that captures and marks the relevant items in the dangerous goods reports using a Lambda function once more than two reports have been filed.
  • C. Build a machine learning model to properly classify dangerous goods and run it on the DynamoDB streams as every new item description is added to the system.
  • D. Build a machine learning model with binary classification for dangerous goods and run it on the DynamoDB streams as every new item description is added to the system.

Answer: C

NEW QUESTION 13
You are working with a customer who has 10 TB of archival data that they want to migrate to Amazon Glacier. The customer has a 1 Mbps connection to the Internet. Which service or feature provides the fastest method of getting the data into Amazon Glacier?

  • A. Amazon Glacier multipart upload
  • B. AWS Storage Gateway
  • C. VM Import/Export
  • D. AWS Import/Export

Answer: D

NEW QUESTION 14
You have an Auto Scaling group associated with an Elastic Load Balancer (ELB). You have noticed that
instances launched via the Auto Scaling group are being marked unhealthy due to an ELB health check, but these unhealthy instances are not being terminated.
What do you need to do to ensure that instances marked unhealthy by the ELB will be terminated and replaced?

  • A. Change the thresholds set on the Auto Scaling group health check
  • B. Add an Elastic Load Balancing health check to your Auto Scaling group
  • C. Increase the value for the Health check interval set on the Elastic Load Balancer
  • D. Change the health check set on the Elastic Load Balancer to use TCP rather than HTTP checks

Answer: B
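A minimal boto3 sketch of answer B, switching the Auto Scaling group's health check type to ELB so that instances failing the load balancer health check are terminated and replaced; the group name and grace period are hypothetical:

    import boto3

    autoscaling = boto3.client("autoscaling")

    # With HealthCheckType="ELB", an instance the load balancer marks unhealthy
    # is also considered unhealthy by Auto Scaling and gets replaced.
    autoscaling.update_auto_scaling_group(
        AutoScalingGroupName="web-asg",
        HealthCheckType="ELB",
        HealthCheckGracePeriod=300,  # seconds to wait after launch before checking
    )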

NEW QUESTION 15
A user has set up an RDS DB with Oracle. The user wants to be notified when someone modifies the security group of that DB. How can the user configure this?

  • A. It is not possible to get the notifications on a change in the security group
  • B. Configure SNS to monitor security group changes
  • C. Configure event notification on the DB security group
  • D. Configure the CloudWatch alarm on the DB for a change in the security group

Answer: C
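A minimal boto3 sketch of answer C, an RDS event subscription scoped to DB security group configuration changes; the subscription name and SNS topic are hypothetical:

    import boto3

    rds = boto3.client("rds")

    # Notify an SNS topic whenever a DB security group's configuration changes.
    rds.create_event_subscription(
        SubscriptionName="db-security-group-changes",
        SnsTopicArn="arn:aws:sns:us-east-1:123456789012:ops-alerts",
        SourceType="db-security-group",
        EventCategories=["configuration change"],
        Enabled=True,
    )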

NEW QUESTION 16
Which of the following instance types are available as Amazon EBS-backed only? Choose 2 answers

  • A. General purpose T2
  • B. General purpose M3
  • C. Compute-optimized C4
  • D. Compute-optimized C3
  • E. Storage-optimized I2

Answer: AC

NEW QUESTION 17
A company needs to monitor the read and write IOPS metrics for its Amazon RDS MySQL instances and send real-time alerts to its operations team. Which AWS services can accomplish this? Choose 2 answers

  • A. Amazon Simple Email Service
  • B. Amazon CloudWatch
  • C. Amazon Simple Queue Service
  • D. Amazon Route 53
  • E. Amazon Simple Notification Service

Answer: BE
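A minimal boto3 sketch combining answers B and E: a CloudWatch alarm on the ReadIOPS metric of an RDS instance that publishes to an SNS topic. The instance identifier, threshold, and topic are hypothetical, and a matching alarm for WriteIOPS would look the same:

    import boto3

    cloudwatch = boto3.client("cloudwatch")

    # Alarm when average ReadIOPS exceeds 1000 for one minute; SNS notifies the team.
    cloudwatch.put_metric_alarm(
        AlarmName="prod-mysql-high-read-iops",
        Namespace="AWS/RDS",
        MetricName="ReadIOPS",
        Dimensions=[{"Name": "DBInstanceIdentifier", "Value": "prod-mysql"}],
        Statistic="Average",
        Period=60,
        EvaluationPeriods=1,
        Threshold=1000.0,
        ComparisonOperator="GreaterThanThreshold",
        AlarmActions=["arn:aws:sns:us-east-1:123456789012:ops-alerts"],
    )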

NEW QUESTION 18
You run a small online consignment marketplace. Interested sellers complete an online application in order to be allowed to sell their products on your website. Once approved, they can list their products using a custom interface. From that point, you manage the shopping cart process so that when a buyer decides to buy a product, you handle the billing and coordinate the shipping. Part of this process requires sending emails to the buyer and the seller at different stages. Your system has been running on AWS for a few months. Occasionally, products are shipped before payment has cleared, and emails are sent out of order. Furthermore, credit cards are sometimes charged twice.
How can you resolve these problems?

  • A. Use the Amazon Simple Queue Service (SQS), and use a different set of workers for each task
  • B. Use the Amazon Simple Workflow Service (SWF), and use a different set of workers for each task.
  • C. Use the Simple Email Service (SES) to control the correct order of email delivery
  • D. Use the AWS Data Pipeline service to control the process flow of the various tasks
  • E. Use the Amazon Simple Queue Service (SQS), and use a single set of workers for each task

Answer: E

NEW QUESTION 19
An organization uses a custom MapReduce application to build monthly reports based on many small data files in an Amazon S3 bucket. The data is submitted from various business units on a frequent but unpredictable schedule. As the dataset continues to grow, it becomes increasingly difficult to process all of the data in one day. The organization has scaled up its Amazon EMR cluster, but other optimizations could improve performance.
The organization needs to improve performance with minimal changes to existing processes and applications.
What action should the organization take?

  • A. Use Amazon S3 Event Notifications and AWS Lambda to create a quick search file index in DynamoDB.
  • B. Add Spark to the Amazon EMR cluster and utilize Resilient Distributed Datasets in-memory.
  • C. Use Amazon S3 Event Notifications and AWS Lambda to index each file into an Amazon Elasticsearch Service cluster.
  • D. Schedule a daily AWS Data Pipeline process that aggregates content into larger files using S3DistCp.
  • E. Have business units submit data via Amazon Kinesis Firehose to aggregate data hourly into Amazon S3.

Answer: A
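A minimal sketch of answer A as a Lambda handler triggered by S3 event notifications, writing one DynamoDB item per new object so files can be located quickly; the table name and attribute names are hypothetical:

    import boto3

    table = boto3.resource("dynamodb").Table("s3-file-index")  # hypothetical table

    def handler(event, context):
        # One S3 event notification can carry several records.
        for record in event["Records"]:
            obj = record["s3"]["object"]
            table.put_item(
                Item={
                    "bucket": record["s3"]["bucket"]["name"],
                    "key": obj["key"],
                    "size": obj["size"],
                    "event_time": record["eventTime"],
                }
            )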

NEW QUESTION 20
A customer has an Amazon S3 bucket. Objects are uploaded simultaneously by a cluster of servers from multiple streams of data. The customer maintains a catalog of objects uploaded in Amazon S3 using an Amazon DynamoDB table. This catalog has the following fields: StreamName, TimeStamp, and ServerName, from which ObjectName can be obtained.
The customer needs to define the catalog to support querying for a given stream or server within a defined time range.
Which DynamoDB table scheme is most efficient to support these queries?

  • A. Define a Primary Key with ServerName as Partition Key and TimeStamp as Sort Key. Do NOT define a Secondary Index or Global Secondary Index.
  • B. Define a Primary Key with StreamName as Partition Key and TimeStamp followed by ServerName as Sort Key. Define a Global Secondary Index with ServerName as Partition Key and TimeStamp followed by StreamName as Sort Key.
  • C. Define a Primary Key with ServerName as Partition Key. Define a Local Secondary Index with StreamName as Partition Key. Define a Global Secondary Index with TimeStamp as Partition Key.
  • D. Define a Primary Key with ServerName as Partition Key. Define a Local Secondary Index with TimeStamp as Partition Key. Define a Global Secondary Index with StreamName as Partition Key and TimeStamp as Sort Key.

Answer: A
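A minimal boto3 sketch of the table in answer A, plus the kind of time-range query per server it supports; the table name and values are hypothetical:

    import boto3
    from boto3.dynamodb.conditions import Key

    dynamodb = boto3.resource("dynamodb")

    # ServerName as partition key, TimeStamp as sort key, no secondary indexes.
    table = dynamodb.create_table(
        TableName="object-catalog",
        KeySchema=[
            {"AttributeName": "ServerName", "KeyType": "HASH"},
            {"AttributeName": "TimeStamp", "KeyType": "RANGE"},
        ],
        AttributeDefinitions=[
            {"AttributeName": "ServerName", "AttributeType": "S"},
            {"AttributeName": "TimeStamp", "AttributeType": "S"},
        ],
        BillingMode="PAY_PER_REQUEST",
    )
    table.wait_until_exists()

    # All objects uploaded by one server within a defined time range.
    result = table.query(
        KeyConditionExpression=Key("ServerName").eq("web-01")
        & Key("TimeStamp").between("2021-05-01T00:00:00Z", "2021-05-02T00:00:00Z")
    )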

NEW QUESTION 21
A company generates a large number of files each month and needs to use AWS Import/Export to move these files into Amazon S3 storage. To satisfy the auditors, the company needs to keep a record of which files were imported into Amazon S3.
What is a low-cost way to create a unique log for each import job?

  • A. Use the same log file prefix in the import/export manifest files to create a versioned log file in Amazon S3 for all imports
  • B. Use the log file prefix in the import/export manifest file to create a unique log file in Amazon S3 for each import
  • C. Use the log file checksum in the import/export manifest file to create a log file in Amazon S3 for each import
  • D. Use a script to iterate over files in Amazon S3 to generate a log after each import/export job

Answer: B

NEW QUESTION 22
When an EC2 instance that is backed by an S3-based AMI is terminated, what happens to the data on the root volume?

  • A. Data is unavailable until the instance is restarted
  • B. Data is automatically deleted
  • C. Data is automatically saved as an EBS snapshot
  • D. Data is automatically saved as an EBS volume

Answer: B

NEW QUESTION 23
A user has provisioned 2,000 IOPS for an EBS volume. The application hosted on that EBS volume is experiencing lower IOPS than provisioned. Which of the below-mentioned options does not affect the IOPS of the volume?

  • A. The application does not have enough IO for the volume
  • B. The instance is EBS optimized
  • C. The EC2 instance has 10 Gigabit Network connectivity
  • D. The volume size is too large

Answer: D

NEW QUESTION 24
A company needs to deploy virtual desktops to its customers in a virtual private cloud, leveraging
existing security controls. Which set of AWS services and features will meet the company’s requirements?

  • A. Virtual private network connection, AWS Directory Service, and ClassicLink
  • B. Virtual private network connection, AWS Directory Service, and Amazon WorkSpaces
  • C. AWS Directory Service, Amazon WorkSpaces, and AWS Identity and Access Management
  • D. Amazon Elastic Compute Cloud and AWS Identity and Access Management

Answer: B

NEW QUESTION 25
Your application uses CloudFormation to orchestrate your application's resources. During your testing phase, before the application went live, your Amazon RDS instance type was changed, which caused the instance to be re-created and resulted in the loss of test data.
How should you prevent this from occurring in the future?

  • A. Within the AWS CloudFormation parameter with which users can select the Amazon RDS instance type, set AllowedValues to only contain the current instance type
  • B. Use an AWS CloudFormation stack policy to deny updates to the instance. Only allow UpdateStack permission to IAM principals that are denied SetStackPolicy
  • C. In the AWS CloudFormation template, set the AWS::RDS::DBInstance's DBInstanceClass property to be read-only
  • D. Subscribe to the AWS CloudFormation notification "BeforeResourceUpdate" and call CancelStackUpdate if the resource identified is the Amazon RDS instance
  • E. In the AWS CloudFormation template, set the AWS::RDS::DBInstance's DeletionPolicy property to "Retain"

Answer: E
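A minimal sketch of answer E: a template fragment with DeletionPolicy set to Retain on the RDS resource, launched with boto3. The property values are illustrative; note that newer templates can additionally set UpdateReplacePolicy: Retain to preserve the old instance when an update forces replacement:

    import textwrap
    import boto3

    # Illustrative template; DeletionPolicy: Retain keeps the DB instance
    # (and its data) even if CloudFormation removes the resource.
    TEMPLATE = """
    AWSTemplateFormatVersion: '2010-09-09'
    Resources:
      TestDatabase:
        Type: AWS::RDS::DBInstance
        DeletionPolicy: Retain
        Properties:
          DBInstanceClass: db.t3.small
          Engine: mysql
          MasterUsername: admin
          MasterUserPassword: change-me-123  # placeholder; use a secure source
          AllocatedStorage: '20'
    """

    boto3.client("cloudformation").create_stack(
        StackName="test-db-stack", TemplateBody=textwrap.dedent(TEMPLATE)
    )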

NEW QUESTION 26
......

Recommend!! Get the full AWS-Certified-Big-Data-Specialty dumps in VCE and PDF from Simply Pass. Welcome to download: https://www.simply-pass.com/Amazon-exam/AWS-Certified-Big-Data-Specialty-dumps.html (New 243 Q&As Version)


