Hands-On Labs

AWS Certified Data Engineer - Associate (DEA-C01)

Build real cloud skills with guided labs on AWS and Google Cloud. Practice in live environments with instant access to real cloud resources. No cloud account required.

10 Available Labs
Instant Access
Launch real cloud environments in seconds

Data Ingestion and Transformation

2 labs available

📚 Intermediate
80m

Build a Streaming Data Pipeline with Amazon Kinesis and AWS Lambda

In this lab, you'll create a real-time streaming data pipeline using Amazon Kinesis Data Streams and AWS Lambda. You'll learn how to ingest data with Kinesis, process it in real time using Lambda, and store the processed data in Amazon S3. This pipeline pattern applies to many real-world workloads, such as monitoring application logs, IoT data streams, or financial transactions. By the end of this lab, you'll have a deeper understanding of building serverless data workflows and integrating different AWS services.

5 tasks
Implement real-time data ingestion with Amazon Kinesis
Process streams using AWS Lambda functions
Store and manage processed data in Amazon S3
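The Kinesis-to-Lambda-to-S3 flow in this lab can be sketched as a short handler. This is a minimal illustration, not the lab's actual solution: the bucket name is hypothetical, and the `s3_client` parameter is added so the logic can be exercised locally without AWS credentials (inside Lambda it falls back to boto3).

```python
import base64
import json

def decode_kinesis_records(event):
    """Extract and JSON-decode the payloads from a Kinesis Lambda event.

    Kinesis delivers record data base64-encoded inside event["Records"].
    """
    records = []
    for record in event.get("Records", []):
        payload = base64.b64decode(record["kinesis"]["data"])
        records.append(json.loads(payload))
    return records

def handler(event, context, s3_client=None):
    """Lambda entry point: decode the batch and persist it to S3 as one object."""
    if s3_client is None:
        import boto3  # only needed inside the Lambda runtime
        s3_client = boto3.client("s3")
    records = decode_kinesis_records(event)
    # One object per invocation, keyed by the request id (hypothetical layout).
    key = f"processed/{context.aws_request_id}.json"
    s3_client.put_object(
        Bucket="my-processed-data-bucket",  # hypothetical bucket name
        Key=key,
        Body=json.dumps(records).encode("utf-8"),
    )
    return {"decoded": len(records)}
```

In the lab itself, the stream, Lambda trigger, and bucket are created through the console or IaC; the handler above only shows the decode-and-store step.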
📚 Intermediate
75m

Ingest and Transform Streaming Data with Kinesis and AWS Glue

In this lab, you will build a streaming data pipeline using Amazon Kinesis Data Streams and AWS Glue to ingest and transform data in real time. By leveraging Kinesis Data Streams for ingestion and AWS Glue for transformation, you'll develop a deeper understanding of processing streaming data at scale. The lab simulates a financial services company collecting and analyzing real-time stock market data to power analytics dashboards for its clients. You will set up data streams, configure AWS Glue jobs for ETL processes, and validate data flow through the system, including critical tasks like setting up Kinesis producers, processing data with AWS Glue jobs, and ensuring transformed data is stored accurately in Amazon S3.

5 tasks
Configure Kinesis Data Streams for real-time ingestion
Implement AWS Glue jobs for data transformation
Set up S3 for data storage with encryption
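The producer side of this lab's stock-market scenario can be sketched as follows. The stream name and tick fields are hypothetical; `send_tick` takes the Kinesis client as a parameter so the payload logic is testable without a live stream. Using the ticker symbol as the partition key keeps each symbol's events ordered within a shard.

```python
import json
import time

def make_tick(symbol, price):
    """Serialize one stock tick; the symbol doubles as the partition key."""
    record = {"symbol": symbol, "price": price, "ts": time.time()}
    return json.dumps(record).encode("utf-8"), symbol

def send_tick(kinesis_client, stream_name, symbol, price):
    """Put a single tick onto the stream via the Kinesis PutRecord API."""
    data, partition_key = make_tick(symbol, price)
    return kinesis_client.put_record(
        StreamName=stream_name, Data=data, PartitionKey=partition_key
    )
```

In the lab environment you would pass `boto3.client("kinesis")` and the stream name you created in the first task.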

Data Operations and Support

3 labs available

📚 Intermediate
90m

Automate Data Ingestion with Amazon Kinesis and AWS Glue

In this lab, you will learn how to automate data ingestion using Amazon Kinesis and AWS Glue to build a reliable and scalable data processing pipeline. You will set up a Kinesis Data Stream to capture streaming data and use AWS Glue to extract, transform, and load (ETL) data into Amazon S3 for storage. This lab will teach you how these components work together to process real-time data efficiently, enabling rapid data insights.

5 tasks
Automate data ingestion with Amazon Kinesis and AWS Glue
Data pipeline orchestration and troubleshooting
Process data efficiently using AWS services
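The orchestration-and-troubleshooting side of this lab boils down to starting a Glue job run and watching it to a terminal state. A minimal sketch, assuming a Glue job (name hypothetical) already exists; the client is passed in so the polling loop can be tested with a stub:

```python
import time

def run_glue_job(glue_client, job_name, poll_seconds=30):
    """Start a Glue ETL job run and block until it reaches a terminal state.

    Uses the Glue StartJobRun / GetJobRun APIs; returns the final state
    (e.g. SUCCEEDED or FAILED) so callers can alert or retry.
    """
    run_id = glue_client.start_job_run(JobName=job_name)["JobRunId"]
    terminal = {"SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"}
    while True:
        run = glue_client.get_job_run(JobName=job_name, RunId=run_id)
        state = run["JobRun"]["JobRunState"]
        if state in terminal:
            return state
        time.sleep(poll_seconds)
```

In production this polling would usually be replaced by Glue triggers or an EventBridge rule, but the loop makes the job lifecycle explicit for troubleshooting.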
📚 Intermediate
110m

Implementing a Real-time Data Stream with Amazon Kinesis

In this hands-on lab, you will learn how to implement a real-time data-processing pipeline using Amazon Kinesis. You will ingest streaming data using Amazon Kinesis Data Streams, process these streams using AWS Lambda and Amazon Kinesis Data Analytics, and deliver the processed data to Amazon S3 for storage. This lab demonstrates the practical aspects of setting up a real-time pipeline that can handle large volumes of data in a business scenario, enabling you to respond to changes in real time and make data-driven decisions quickly. The lab covers key concepts of data ingestion, processing, and storage in a serverless architecture. You will also learn how to ensure data quality throughout the processing pipeline. Upon completion, you should have a strong understanding of how to manage a real-time streaming data pipeline on AWS and be prepared for associate-level AWS certification topics involving data engineering with real-time feeds.

5 tasks
Implementing real-time data pipelines using Amazon Kinesis
Setting up serverless data processing using AWS Lambda
Using Amazon S3 for data storage and management
+1 more
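The data-quality step this lab mentions is often implemented as a validation gate between ingestion and storage. A minimal sketch with hypothetical field names (`device_id`, `ts`, `value`): valid records continue down the pipeline, invalid ones are routed aside with a reason rather than silently dropped.

```python
def validate_record(record, required_fields=("device_id", "ts", "value")):
    """Basic data-quality check: return (ok, reason) for one record."""
    for field in required_fields:
        if field not in record:
            return False, f"missing field: {field}"
    if not isinstance(record["value"], (int, float)):
        return False, "value is not numeric"
    return True, "ok"

def partition(records):
    """Split a batch into valid records and annotated rejects.

    Rejects would typically go to a dead-letter S3 prefix for inspection.
    """
    good, bad = [], []
    for r in records:
        ok, reason = validate_record(r)
        if ok:
            good.append(r)
        else:
            bad.append({"record": r, "reason": reason})
    return good, bad
```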
📚 Intermediate
100m

Building a Data Streaming Pipeline with Amazon Kinesis and AWS Lambda

In this lab, you will build a data streaming pipeline using Amazon Kinesis Data Streams and Amazon Kinesis Data Firehose. You'll configure AWS Lambda to process streaming data and persist it into Amazon S3 for further analysis. This lab will help you understand streaming data architectures and the integration of these services to automate and facilitate data processing pipelines.

5 tasks
Configuring and managing Kinesis Data Streams and Data Firehose
Creating and deploying AWS Lambda functions for data processing
Utilizing S3 for data persistence and ensuring security through encryption and IAM policies
+1 more
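When Lambda sits between Firehose and S3, it follows Firehose's data-transformation contract: Firehose invokes the function with a batch of base64-encoded records, and each output record must echo its `recordId` and report a result of `Ok`, `Dropped`, or `ProcessingFailed`. A sketch of such a transform (the `processed` flag is a hypothetical enrichment):

```python
import base64
import json

def handler(event, context):
    """Kinesis Data Firehose transformation Lambda.

    Firehose batches records, invokes this function, and delivers whatever
    comes back to the S3 destination.
    """
    output = []
    for record in event["records"]:
        try:
            payload = json.loads(base64.b64decode(record["data"]))
            payload["processed"] = True  # example enrichment
            data = base64.b64encode(
                (json.dumps(payload) + "\n").encode("utf-8")
            ).decode("ascii")
            output.append(
                {"recordId": record["recordId"], "result": "Ok", "data": data}
            )
        except (ValueError, KeyError):
            # Malformed payloads are flagged; Firehose can retry or dead-letter them.
            output.append({
                "recordId": record["recordId"],
                "result": "ProcessingFailed",
                "data": record["data"],
            })
    return {"records": output}
```

The trailing newline keeps the delivered S3 objects line-delimited, which simplifies later querying.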

Data Security and Governance

3 labs available

📚 Intermediate
95m

Secure Data Streaming with Amazon Kinesis and AWS KMS

In this lab, you will create a secure data streaming pipeline using Amazon Kinesis services. You will set up a complete data flow from data ingestion to storage, ensuring data security through encryption. The lab focuses on integrating AWS KMS for encryption and applying proper IAM policies to manage access. This hands-on experience will help reinforce your knowledge of data security in streaming architectures.

5 tasks
Managing AWS Kinesis services for data streaming
Using AWS KMS for data encryption
Configuring IAM policies for secure access
+2 more
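The KMS integration at the heart of this lab is a single API call on an existing stream. A sketch (stream name and key id are placeholders you'd substitute with your own), with the client injected so the call shape can be checked locally:

```python
def enable_stream_encryption(kinesis_client, stream_name, kms_key_id):
    """Turn on server-side encryption for an existing Kinesis stream.

    Uses the StartStreamEncryption API. After this, producers need
    kms:GenerateDataKey on the key and consumers need kms:Decrypt,
    which is where the lab's IAM policy work comes in.
    """
    kinesis_client.start_stream_encryption(
        StreamName=stream_name,
        EncryptionType="KMS",
        KeyId=kms_key_id,
    )
```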
📚 Intermediate
150m

Implementing Data Encryption Across AWS with KMS and S3

In this lab, you will learn how to ensure data encryption in transit and at rest using AWS KMS and S3. Participants will create KMS keys and use them to encrypt data stored in S3, understanding both server-side and client-side encryption options. This hands-on experience will solidify your knowledge on securing data in AWS and complying with data privacy requirements.

5 tasks
Creating and managing AWS KMS keys
Configuring S3 for server-side and client-side encryption
Using AWS Glue for data masking
+1 more
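For the server-side option, requesting SSE-KMS is just two extra parameters on the S3 PutObject call. A sketch that builds those arguments (bucket, key, and KMS alias are illustrative); S3 encrypts the object on write, and anyone calling GetObject later also needs kms:Decrypt on the key:

```python
def sse_kms_put_args(bucket, key, body, kms_key_id):
    """Build PutObject arguments requesting SSE-KMS with a customer-managed key.

    Pass the result to s3_client.put_object(**args) in a real session.
    """
    return {
        "Bucket": bucket,
        "Key": key,
        "Body": body,
        "ServerSideEncryption": "aws:kms",
        "SSEKMSKeyId": kms_key_id,
    }
```

Client-side encryption, by contrast, means encrypting the payload yourself (e.g. with a data key from KMS GenerateDataKey) before it ever reaches S3; the lab walks through both options.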
📚 Intermediate
100m

Building a Secure Data Pipeline with AWS Kinesis and S3

In this lab, you will build a secure data pipeline using Amazon Kinesis Data Streams and Amazon S3. You will integrate these services with AWS IAM and AWS KMS to ensure that your data remains secure both in transit and at rest. You will learn how to set up streaming data ingestion using Kinesis, store encrypted data in S3, and implement fine-grained access control with IAM. This lab emphasizes the practical application of AWS data security practices. By completing this lab, you will enhance your skills in configuring a secure and efficient data pipeline, crucial for handling sensitive data in real-world applications.

5 tasks
Setting up and managing Kinesis Data Streams
Implementing IAM roles and policies for secure access control
Configuring S3 buckets with server-side encryption using AWS KMS
+2 more
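The fine-grained access control this lab emphasizes usually takes the form of a least-privilege policy document scoped to exactly one stream, one S3 prefix, and one key. A sketch with hypothetical ARNs and a hypothetical `processed/` prefix:

```python
import json

def pipeline_read_policy(stream_arn, bucket_arn, key_arn):
    """Least-privilege IAM policy for a pipeline consumer role:
    read one Kinesis stream, write one S3 prefix, decrypt with one KMS key."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": [
                    "kinesis:GetRecords",
                    "kinesis:GetShardIterator",
                    "kinesis:DescribeStream",
                    "kinesis:ListShards",
                ],
                "Resource": stream_arn,
            },
            {
                "Effect": "Allow",
                "Action": ["s3:PutObject"],
                "Resource": f"{bucket_arn}/processed/*",
            },
            {
                "Effect": "Allow",
                "Action": ["kms:Decrypt"],
                "Resource": key_arn,
            },
        ],
    }
```

The JSON-serialized result is what you would attach to the consumer's IAM role in the lab.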

Data Store Management

2 labs available

📚 Intermediate
95m

Implementing a Streaming Data Processing Pipeline with Amazon Kinesis

In this lab, you'll build a real-time data processing pipeline using Amazon Kinesis services. You will configure Kinesis Data Streams to ingest streaming data, process it with AWS Lambda functions, and store the results in Amazon S3. Additionally, you'll set up AWS Glue to catalog the data and enable quick access with Amazon Athena for data analytics. This lab will give you hands-on experience with critical AWS services that are foundational for real-time analytics solutions, demonstrating how to efficiently integrate streaming and batch processes.

5 tasks
Implement real-time data ingestion with Amazon Kinesis Data Streams
Configure serverless data processing using AWS Lambda
Store and manage processed data in Amazon S3
+2 more
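Once Glue has cataloged the S3 data, the Athena step in this lab is a matter of submitting SQL against the cataloged table. A sketch (database name and results location are placeholders); Athena runs the query asynchronously and writes results to the given S3 location, so you poll the returned execution id:

```python
def run_athena_query(athena_client, sql, database, output_s3):
    """Submit a query against a Glue-cataloged database via StartQueryExecution.

    Returns the QueryExecutionId to poll with get_query_execution.
    """
    resp = athena_client.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output_s3},
    )
    return resp["QueryExecutionId"]
```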
📚 Intermediate
100m

Real-time Data Streaming with Amazon Kinesis and AWS Glue

In this lab, you will build a real-time data streaming architecture using Amazon Kinesis Data Streams and AWS Glue. You'll set up a data ingestion pipeline that captures, processes, and catalogs data to make it ready for real-time analytics, reducing processing latency so insights are available at the speed your business needs. You'll work through a scenario in which you integrate data from multiple sources into a single platform for consistent, streamlined analytics workflows.

5 tasks
Implementing real-time data ingestion using Amazon Kinesis
Configuring AWS Glue Data Catalog and Crawlers
ETL job management using AWS Glue
+1 more
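The cataloging step above is typically automated with a Glue crawler: point it at an S3 path and it infers schemas and registers tables in the Data Catalog. A sketch of creating and kicking off such a crawler (all names, the role ARN, and the path are placeholders for values you'd create in the lab):

```python
def create_s3_crawler(glue_client, name, role_arn, database, s3_path):
    """Create and start a crawler that catalogs objects under s3_path.

    Uses the Glue CreateCrawler / StartCrawler APIs; discovered tables
    land in the given Data Catalog database.
    """
    glue_client.create_crawler(
        Name=name,
        Role=role_arn,
        DatabaseName=database,
        Targets={"S3Targets": [{"Path": s3_path}]},
    )
    glue_client.start_crawler(Name=name)
```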