
AWS Certified Big Data - Specialty Certification - Part 24

Mary Smith

Sun, 19 Apr 2026

1. A company releases new features frequently and requires its applications to be highly available. After each deployment, the application test logs from every updated Amazon EC2 instance must be analyzed in real time to confirm that the application works flawlessly. Which of these methods should be used to ship and analyze the logs in a highly available way? Please choose:

A) Ship the logs to Amazon CloudWatch Logs and use Amazon EMR to analyze the logs in batch mode every hour
B) Ship the logs to Amazon Kinesis and use Kinesis consumers to analyze the logs in real time
C) None
D) Ship the logs to Amazon S3 for durable storage and use Amazon Redshift to analyze the logs in batch mode every hour
E) Ship the logs to a large Amazon EC2 instance and analyze the logs in real time


2. Which of the following can be used to move very large volumes of data to AWS, up to 100 PB per unit?

A) AWS Snowmobile
B) AWS Import/Export
C) None
D) AWS S3 Export
E) AWS Snowball


3. You can efficiently add new data to an existing table by using a combination of updates and inserts against the table. However, Amazon Redshift does not support a single merge or upsert command to update a table from one data source. Instead, the merge operation can be performed by creating a staging table and then using one of the documented methods to update the target table from the staging table. Which of the following can be done to ensure that proper compression settings are used for a Redshift table, if the compression settings are applied manually?

A) Use CloudWatch Logs to see how the data is accessed
B) Use CloudWatch metrics to see how the data is accessed
C) Use the RANK compression command
D) Use the ANALYZE COMPRESSION command
E) None
E) None


4. To save streaming records directly to storage services such as Amazon S3, Amazon Redshift, or Amazon Elasticsearch Service, you can use Amazon Kinesis Data Firehose instead of writing a consumer application. Which of the following is not a factor in performance when migrating a database using AWS Database Migration Service?

A) The VPC properties of the replication server
B) The number of objects to be migrated
C) None
D) The resource capacity of the replication server
E) The availability of resources on the source


5. Your company is planning to build an application that will use Amazon Kinesis. The streams need to be stored directly in S3. Which of the following is the recommended design approach for this requirement?

A) Use Kinesis Data Firehose to deliver the message stream into S3
B) Create a DynamoDB stream to store the records in S3
C) Use an S3 event trigger to store the records
D) Create a consumer application with the KCL library to store the records
E) None


1. Right Answer: B
Explanation: Amazon Kinesis makes it easy to collect, process, and analyze streaming data in real time, so you can react quickly to new information. Amazon Kinesis offers key capabilities to cost-effectively process streaming data at any scale, along with the flexibility to choose the tools that best suit the requirements of your application. With Amazon Kinesis you can ingest real-time data such as video, audio, application logs, and website clickstreams.
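As a sketch of option B, each log line can be sent to a Kinesis stream with the `PutRecords` API. The stream name `app-logs`, the helper function, and the partition-key scheme below are illustrative assumptions, not details from the question.

```python
import json

def build_put_records(log_lines, stream_name):
    """Build a kinesis.put_records(**request) payload from raw log lines.

    Each line becomes one record; in practice the EC2 instance ID would
    make a natural partition key, but here a simple rotating key is used
    to spread records across shards.
    """
    records = [
        {
            "Data": json.dumps({"line": line}).encode("utf-8"),
            "PartitionKey": str(i % 4),  # hypothetical sharding scheme
        }
        for i, line in enumerate(log_lines)
    ]
    return {"StreamName": stream_name, "Records": records}

# Usage (the actual call needs AWS credentials and an existing stream):
#   import boto3
#   kinesis = boto3.client("kinesis")
#   kinesis.put_records(**build_put_records(lines, "app-logs"))
req = build_put_records(["deploy ok", "healthcheck 200"], "app-logs")
```

A Kinesis consumer (for example, one built with the KCL) would then read and analyze these records as they arrive, which is what makes the real-time analysis in option B possible.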

2. Right Answer: A
Explanation: AWS Snowmobile is an exabyte-scale data transfer service used to move very large volumes of data to AWS. You can transfer up to 100 PB per Snowmobile, a 45-foot long ruggedized shipping container pulled by a semi-trailer truck.

3. Right Answer: D
Explanation: If you choose to apply compression encodings manually, you can run the ANALYZE COMPRESSION command on an already-populated table and use the results to choose the compression encodings. Compression is a column-level operation that reduces the size of data when it is stored. Compression conserves storage space and reduces the size of data read from storage, which reduces the amount of disk I/O and therefore improves query performance.
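The staging-table merge that the question alludes to can be sketched as generated SQL. The table names `sales` and `stage_sales` and the key column `sale_id` are hypothetical; the delete-then-insert pattern shown is one of the documented Redshift merge methods.

```python
def merge_sql(target, staging, key):
    """Return Redshift statements for a delete-then-insert merge:
    rows in the target that match the staging table by key are replaced."""
    return [
        "BEGIN TRANSACTION;",
        f"DELETE FROM {target} USING {staging} "
        f"WHERE {target}.{key} = {staging}.{key};",
        f"INSERT INTO {target} SELECT * FROM {staging};",
        f"DROP TABLE {staging};",
        "END TRANSACTION;",
    ]

# Before choosing manual encodings, ANALYZE COMPRESSION can be run on
# the populated target table, e.g.:  ANALYZE COMPRESSION sales;
statements = merge_sql("sales", "stage_sales", "sale_id")
```

Wrapping the merge in a single transaction keeps the target table consistent for concurrent readers while the staging data is applied.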

4. Right Answer: A
Explanation: A number of factors affect the performance of an AWS DMS migration: the availability of resources on the source, the available network throughput, the resource capacity of the replication server, the ability of the target to ingest changes, and the type and distribution of the source data.

5. Right Answer: A
Explanation: Amazon Kinesis Data Firehose is the easiest way to load streaming data into data stores such as Amazon S3. It is fully managed and delivers the data to the configured destination without requiring you to write a consumer application.
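A minimal sketch of option A: records are pushed to a Firehose delivery stream that has been configured (separately, in AWS) with S3 as its destination. The delivery stream name `logs-to-s3` and the helper below are assumptions for illustration.

```python
def build_record_batch(payloads, delivery_stream):
    """Build a firehose.put_record_batch(**request) payload.

    Firehose buffers the records and writes them to the S3 bucket
    configured on the delivery stream; no consumer application is
    required, which is why option A is the recommended approach.
    """
    return {
        "DeliveryStreamName": delivery_stream,
        "Records": [{"Data": (p + "\n").encode("utf-8")} for p in payloads],
    }

# Usage (requires AWS credentials and an existing delivery stream):
#   import boto3
#   firehose = boto3.client("firehose")
#   firehose.put_record_batch(**build_record_batch(events, "logs-to-s3"))
```

Appending a newline to each record keeps the objects that Firehose writes to S3 line-delimited, which downstream batch tools generally expect.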
