
AWS Certified Big Data - Specialty Certification - Part 4

Mary Smith

Sun, 19 Apr 2026


1. A vendor needs access to an S3 bucket in your account, and the vendor already has their own AWS account. How can you give the vendor access to this bucket?

A) None
B) Create a cross-account IAM role for the vendor's account and grant the role access to the S3 bucket.
C) Create an S3 bucket policy that allows the vendor's AWS account to read from the bucket.
D) Create a new IAM user and grant it the access the vendor needs to the bucket.
E) Create a new IAM group and grant it the access the vendor needs to the bucket.


2. When calculating the cost of using EMR, which of the following should be considered? (Select 3 answers)

A) The cost of EMR services.
B) The cost of the underlying EC2 instances
C) EBS storage costs during use.
D) Price of the underlying VPC



3. You need a service for fast and cost-effective Extract Transform Load (ETL) on large data volumes. The source data is in AWS S3 and the processed results are written back to S3. Which of the following services could be used?

A) AWS Kinesis
B) AWS EMR
C) None
D) AWS SQS
E) AWS IoT


4. Which of the following commands can be used to transfer data from DynamoDB to Redshift?

A) None
B) EXPORT
C) COPY
D) DistCp
E) UNLOAD


5. Which option is best suited as an interactive and collaborative notebook for data exploration?

A) D3
B) Hive
C) Zeppelin
D) Kinesis Analytics
E) None


1. Right Answer: B
Explanation: You share resources in your account with another account by creating a cross-account IAM role. You do not need to create separate IAM users in each account, and users do not have to sign out of your account and sign in to another one to reach resources in different AWS accounts. After setting up the role, it can be used from the AWS Management Console, the AWS CLI, and the API.
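The cross-account role described above boils down to two policy documents: a trust policy that lets the vendor's account assume the role, and an access policy that grants the role read access to the bucket. The sketch below builds both as plain dicts; the account IDs, bucket name, and role are hypothetical placeholders, not values from the question.

```python
import json

# Assumed example values: 999988887777 is the vendor's account,
# "shared-data-bucket" is the bucket to share.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        # Allow principals in the vendor's account to assume this role.
        "Principal": {"AWS": "arn:aws:iam::999988887777:root"},
        "Action": "sts:AssumeRole",
    }],
}

access_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        # Grant read-only access to the shared bucket and its objects.
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": [
            "arn:aws:s3:::shared-data-bucket",
            "arn:aws:s3:::shared-data-bucket/*",
        ],
    }],
}

print(json.dumps(trust_policy, indent=2))
```

In practice you would attach these documents to an IAM role in your account and hand the role's ARN to the vendor, who then calls `sts:AssumeRole` to obtain temporary credentials.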

2. Right Answer: A, B, C
Explanation: AWS documentation mentions the following: Amazon EMR pricing is simple and predictable: you pay a per-second rate for every second you use, with a one-minute minimum. The Amazon EMR price is in addition to the Amazon EC2 price (the price of the underlying servers) and the Amazon EBS price (for attached Amazon EBS volumes).
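The three cost components above can be sketched as simple arithmetic. All prices, instance counts, and hours below are hypothetical placeholders for illustration, not current AWS rates:

```python
# Rough cost sketch for an EMR cluster; every number here is an
# assumed placeholder, not a real AWS price.
ec2_price_per_hour = 0.192     # underlying EC2 instance (assumed)
emr_price_per_hour = 0.048     # EMR surcharge on that instance (assumed)
ebs_price_per_gb_month = 0.10  # EBS storage (assumed)

nodes = 4
hours = 100
ebs_gb_per_node = 64

# EMR is billed on top of the underlying EC2 price.
compute_cost = nodes * hours * (ec2_price_per_hour + emr_price_per_hour)

# EBS is billed per GB-month; prorate by hours used (~730 hours/month).
ebs_cost = nodes * ebs_gb_per_node * ebs_price_per_gb_month * (hours / 730)

total = compute_cost + ebs_cost
print(round(total, 2))
```

The point of the exercise: the EMR line item alone understates the bill, because the EC2 and EBS charges accrue alongside it.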

3. Right Answer: B
Explanation: Amazon EMR is a managed cluster platform that simplifies running big data frameworks, such as Apache Hadoop and Apache Spark, on AWS to process and analyze large amounts of data. With these frameworks and related open-source projects, such as Apache Hive and Apache Pig, you can process data for analytics and business intelligence workloads. Additionally, you can use Amazon EMR to transform and move large amounts of data into and out of other AWS data stores and databases, such as Amazon Simple Storage Service (Amazon S3) and Amazon DynamoDB.
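An S3-to-S3 ETL run on EMR is typically launched as a transient cluster with a single step. The dict below is a minimal sketch of such a request; the cluster name, bucket, and script path are hypothetical, and with boto3 you would pass it as `emr_client.run_job_flow(**request)`:

```python
# Sketch of an EMR cluster request for an S3-to-S3 ETL job.
# Bucket name and the etl.py script path are assumptions for illustration.
request = {
    "Name": "etl-cluster",
    "ReleaseLabel": "emr-6.15.0",
    "Applications": [{"Name": "Spark"}],
    "Instances": {
        "InstanceGroups": [
            {"InstanceRole": "MASTER", "InstanceType": "m5.xlarge", "InstanceCount": 1},
            {"InstanceRole": "CORE", "InstanceType": "m5.xlarge", "InstanceCount": 2},
        ],
        # Transient cluster: shut down when the step finishes.
        "KeepJobFlowAliveWhenNoSteps": False,
    },
    "Steps": [{
        "Name": "s3-to-s3-etl",
        "ActionOnFailure": "TERMINATE_CLUSTER",
        "HadoopJarStep": {
            "Jar": "command-runner.jar",
            # Spark job that reads raw data from S3 and writes the
            # transformed output back to S3 (script path is assumed).
            "Args": ["spark-submit", "s3://my-bucket/scripts/etl.py"],
        },
    }],
    "JobFlowRole": "EMR_EC2_DefaultRole",
    "ServiceRole": "EMR_DefaultRole",
}
print(request["Steps"][0]["Name"])
```

Because the cluster terminates when the step completes, you pay only for the processing time, which is what makes EMR cost-effective for this pattern.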

4. Right Answer: C
Explanation: AWS documentation mentions the following: Amazon Redshift complements Amazon DynamoDB with advanced business intelligence capabilities and a powerful SQL-based interface. When you copy data from a DynamoDB table into Amazon Redshift, you can perform complex data analysis queries on that data, including joins with other tables in your Amazon Redshift cluster.
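The COPY statement for a DynamoDB source follows the documented `COPY … FROM 'dynamodb://table'` form. The sketch below composes such a statement as a string; the Redshift table, DynamoDB table, and role ARN are hypothetical examples:

```python
# Compose a Redshift COPY statement that loads from a DynamoDB table.
# Table names and the role ARN are assumed placeholders.
table = "movies"
dynamodb_table = "Movies"
role_arn = "arn:aws:iam::123456789012:role/RedshiftCopyRole"

copy_sql = (
    f"COPY {table} "
    f"FROM 'dynamodb://{dynamodb_table}' "
    f"IAM_ROLE '{role_arn}' "
    # READRATIO caps the share of the DynamoDB table's provisioned
    # read throughput the load may consume (here 50%).
    "READRATIO 50;"
)
print(copy_sql)
```

You would run the resulting statement against the Redshift cluster (for example via a SQL client or the Redshift Data API); tuning READRATIO keeps the load from starving live readers of the DynamoDB table.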

5. Right Answer: C
Explanation: Apache Zeppelin is a web-based notebook for interactive, collaborative data exploration and visualization, and it is available as an application on Amazon EMR.
