
AWS Certified Big Data - Specialty Certification - Part 15

Mary Smith

Sun, 19 Apr 2026


1. Which of the following is a self-service tool that can be used to build real-time streaming applications?

A) Kinesis
B) Kinesis Firehose
C) SQS
D) None
E) Kafka


2. Which of the following can be used to store log files generated from an EMR cluster? Please choose:

A) Amazon S3
B) None
C) Amazon CloudTrail
D) Amazon Glacier
E) Amazon DynamoDB


3. There is a requirement to transfer 3 TB of data to AWS. There is a time limit for the data transfer, and the only available link to the AWS cloud is a single 100 Mbit line. What is the best solution for transferring the data to the cloud?

A) AWS Import/Export
B) None
C) AWS Storage Gateway
D) AWS Direct Connect
E) Amazon S3


4. Which of the following services can be used to automate the movement and transformation of data in AWS? Please choose:

A) AWS Data Pipeline
B) Amazon Redshift
C) None
D) AWS Elasticsearch
E) Amazon DynamoDB


5. You need a cost-effective solution for storing a large collection of video files, together with a data warehouse that can query the data. Which of the following would meet the requirements? Choose two of the options listed below; each answer is part of the solution:

A) Store pointers to the video files in Amazon Redshift
B) Store the collection of video files in Amazon Redshift
C) None
D) Store pointers to the video files in Amazon Kinesis
E) Store the collection of video files in Amazon S3


1. Right Answer: E
Explanation: Apache Kafka is an open-source, distributed messaging system that allows you to build real-time streaming applications. You can send streaming data such as website clickstreams, financial transactions, and application logs to your Kafka cluster, where the data is buffered and then flows into stream-processing applications built on frameworks such as Apache Spark Streaming and Apache Storm.
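Kafka itself needs a running broker, so as an illustration only, here is a minimal in-memory sketch in plain Python (no Kafka client, all names hypothetical) of the produce/buffer/consume pattern the explanation describes:

```python
from collections import deque

class MiniTopic:
    """Toy stand-in for a Kafka topic: producers append, consumers read in order."""
    def __init__(self):
        self.buffer = deque()  # records are buffered until a consumer processes them

    def produce(self, record):
        self.buffer.append(record)

    def consume_all(self, process):
        while self.buffer:
            process(self.buffer.popleft())

# Simulated clickstream events flowing into the topic
topic = MiniTopic()
for event in ["page_view:home", "click:buy", "page_view:cart"]:
    topic.produce(event)

# A tiny "stream processor" that counts event types, standing in for
# a Spark Streaming or Storm job reading from the topic
counts = {}
def count_event(record):
    kind = record.split(":")[0]
    counts[kind] = counts.get(kind, 0) + 1

topic.consume_all(count_event)
print(counts)  # {'page_view': 2, 'click': 1}
```

The point of the buffer is decoupling: producers can keep appending while consumers process at their own pace, which is the role the Kafka cluster plays between data sources and stream processors.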

2. Right Answer: A
Explanation: You can use the Amazon EMR management interfaces and log files to troubleshoot cluster problems such as failures or errors. Amazon EMR can archive log files to Amazon S3, so you can keep the logs and troubleshoot problems even after the cluster has terminated.
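As a sketch of how this archiving is configured: the `LogUri` parameter of the EMR `RunJobFlow` API points the cluster at an S3 location. The snippet below only builds the request parameters (the bucket name and cluster details are hypothetical) and does not call AWS:

```python
# Request parameters in the shape expected by boto3's emr.run_job_flow(**params);
# nothing is sent to AWS here.
params = {
    "Name": "example-cluster",                 # hypothetical cluster name
    "LogUri": "s3://my-log-bucket/emr-logs/",  # hypothetical bucket: EMR archives logs here
    "ReleaseLabel": "emr-6.10.0",
    "Instances": {
        "InstanceCount": 3,
        "MasterInstanceType": "m5.xlarge",
        "SlaveInstanceType": "m5.xlarge",
    },
}

# Logs land under LogUri even after the cluster terminates, which is what
# makes post-mortem troubleshooting possible.
assert params["LogUri"].startswith("s3://")
```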

3. Right Answer: A
Explanation: At 100 Mbit/s, transferring 3 TB would take roughly 67 hours even at full line rate, so shipping the data on physical storage devices with AWS Import/Export is the fastest way to get it into AWS within the time limit.
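The back-of-the-envelope arithmetic behind that answer:

```python
# Time to push 3 TB over a single 100 Mbit/s line, assuming full line rate
data_bits = 3e12 * 8   # 3 TB (decimal terabytes) in bits
line_rate = 100e6      # 100 Mbit/s
seconds = data_bits / line_rate
hours = seconds / 3600
print(round(hours, 1))  # 66.7 -- nearly three days at best
```

In practice the link would also carry other traffic and rarely sustain full rate, so the real transfer time would be even longer, which is why a physical transfer service wins here.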

4. Right Answer: A
Explanation: AWS Data Pipeline is a web service that you can use to automate the movement and transformation of data. With AWS Data Pipeline, you define data-driven workflows, so that tasks can depend on the successful completion of previous tasks. You define the parameters of your data transformations, and AWS Data Pipeline enforces the logic that you have set up.
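To illustrate the "data-driven workflow" idea, a pipeline definition is a set of objects with dependencies between them. The fragment below mimics the general shape of such a definition (the ids, schedule, and command are hypothetical) without calling the service:

```python
# Hypothetical pipeline definition: copy some data, then run a transform that
# depends on the copy, mirroring Data Pipeline's task-dependency model.
pipeline = {
    "objects": [
        {"id": "DailySchedule", "type": "Schedule", "period": "1 day"},
        {"id": "CopyInput", "type": "CopyActivity",
         "schedule": {"ref": "DailySchedule"}},
        {"id": "TransformData", "type": "ShellCommandActivity",
         "dependsOn": {"ref": "CopyInput"},  # runs only if the copy succeeds
         "command": "python transform.py"},
    ]
}

# Every dependency must refer to another object defined in the pipeline.
ids = {obj["id"] for obj in pipeline["objects"]}
assert pipeline["objects"][2]["dependsOn"]["ref"] in ids
```

The `dependsOn` reference is what makes the workflow data-driven: the transform task is only scheduled once the copy it depends on has completed successfully.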

5. Right Answer: A, E
Explanation: Amazon S3 provides durable, cost-effective storage for large objects such as video files, while pointers to the files can be kept in Amazon Redshift so the data warehouse can query them. Redshift itself is not designed to store large binary objects.
