AWS Certified Data Engineer - Associate (DEA-C01)
Build real cloud skills with guided labs on AWS and Google Cloud. Practice in live environments with instant access to real cloud resources. No cloud account required.
2 labs available
In this lab, you'll create a real-time streaming data pipeline using Amazon Kinesis Data Streams and AWS Lambda. You'll learn how to ingest data with Kinesis, process it in real time using Lambda, and store the processed data in Amazon S3. This pipeline applies to various real-world workloads such as monitoring application logs, IoT data streams, or financial transactions. By the end of this lab, you'll have a deeper understanding of building serverless data workflows and integrating different AWS services.
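The Lambda processing step in a pipeline like this might look like the following minimal sketch. The payload shape, the placeholder transformation, and the bucket name are assumptions for illustration, not part of the lab itself:

```python
import base64
import json

# Hypothetical destination bucket; in practice this would come from an
# environment variable set in the Lambda configuration.
DEST_BUCKET = "processed-records-bucket"

def lambda_handler(event, context):
    """Decode a batch of Kinesis records and return the processed batch.

    Kinesis delivers record data base64-encoded. A real handler would
    finish by writing `processed` to S3 with boto3, e.g.:
    s3.put_object(Bucket=DEST_BUCKET, Key=..., Body=json.dumps(processed))
    """
    processed = []
    for record in event["Records"]:
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        payload["processed"] = True  # placeholder transformation
        processed.append(payload)
    return {"recordCount": len(processed), "records": processed}
```

Because the handler is plain Python, you can exercise it locally with a hand-built test event before wiring it to a real stream.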
In this lab, you will build a streaming data pipeline using Amazon Kinesis Data Streams and AWS Glue to ingest and transform data in real time. By pairing Kinesis Data Streams for ingestion with AWS Glue for transformation, you'll develop a deeper understanding of processing streaming data at scale. The lab simulates a financial services company collecting and analyzing real-time stock market data to power analytics dashboards for its clients. You will set up data streams, configure AWS Glue jobs for ETL processes, and validate data flow through the system, including critical tasks like setting up Kinesis producers, processing data with AWS Glue jobs, and ensuring transformed data is stored accurately in Amazon S3.
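A Kinesis producer for the stock-tick scenario could be sketched as below. The record fields are illustrative, and `send_ticks` is only called when you have AWS credentials configured, since it performs real API calls:

```python
import json
import random
import time

def build_tick(symbol):
    """Build one simulated stock-tick record (fields are illustrative)."""
    return {
        "ticker": symbol,
        "price": round(random.uniform(10, 500), 2),
        "timestamp": int(time.time() * 1000),
    }

def send_ticks(stream_name, symbols):
    """Send one tick per symbol to a Kinesis Data Stream.

    Requires AWS credentials, so it is not executed here. Using the
    ticker as the partition key keeps all records for a given symbol
    on the same shard, preserving per-symbol ordering.
    """
    import boto3  # only needed when actually sending
    kinesis = boto3.client("kinesis")
    for symbol in symbols:
        kinesis.put_record(
            StreamName=stream_name,
            Data=json.dumps(build_tick(symbol)),
            PartitionKey=symbol,
        )
```

The partition-key choice is the design decision to notice: it trades even shard distribution for ordered delivery per symbol.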
3 labs available
In this lab, you will learn how to automate data ingestion using Amazon Kinesis and AWS Glue to build a reliable and scalable data processing pipeline. You will set up a Kinesis Data Stream to capture streaming data and use AWS Glue to extract, transform, and load (ETL) data into Amazon S3 for storage. This lab will teach you how these components work together to process real-time data efficiently, enabling rapid data insights.
In this hands-on lab, you will learn how to implement a real-time data-processing pipeline using Amazon Kinesis. You will ingest streaming data using Amazon Kinesis Data Streams, process the streams using AWS Lambda and Amazon Kinesis Data Analytics, and deliver the processed data to Amazon S3 for storage. This lab demonstrates the practical aspects of setting up a real-time data pipeline that can handle large volumes of data in a business scenario, enabling you to respond to changes in real time and make data-driven decisions quickly. The lab covers key concepts of data ingestion, processing, and storage in a serverless architecture. You will also learn how to ensure data quality throughout the data processing pipeline. Upon completion of this lab, you should have a strong understanding of how to manage a real-time streaming data pipeline on AWS and be prepared for associate-level AWS certification topics involving data engineering with real-time feeds.
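The data-quality step mentioned above often amounts to per-record validation before storage. A minimal sketch, assuming a hypothetical schema with `device_id`, `timestamp`, and a numeric `value`:

```python
def validate_record(record, required_fields=("device_id", "timestamp", "value")):
    """Return a list of data-quality problems for one record ([] means clean).

    The required fields are illustrative; the lab's actual schema may differ.
    """
    problems = []
    for field in required_fields:
        if field not in record:
            problems.append(f"missing field: {field}")
    if "value" in record and not isinstance(record["value"], (int, float)):
        problems.append("value is not numeric")
    return problems
```

In a pipeline, records with a non-empty problem list would typically be routed to a dead-letter destination rather than dropped silently.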
In this lab, you will build a data streaming pipeline using Amazon Kinesis Data Streams and Amazon Kinesis Data Firehose. You'll configure AWS Lambda to process streaming data and persist it into Amazon S3 for further analysis. This lab will help you understand streaming data architectures and the integration of these services to automate and facilitate data processing pipelines.
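When Lambda sits between Kinesis Data Firehose and S3, it must follow Firehose's data-transformation contract: echo back each `recordId` with a `result` of `Ok`, `Dropped`, or `ProcessingFailed`, plus re-encoded `data`. A minimal sketch (the newline-per-record convention is a common choice for S3 delivery, assumed here):

```python
import base64
import json

def lambda_handler(event, context):
    """Firehose data-transformation handler.

    Firehose passes each buffered record base64-encoded. This handler
    decodes the JSON payload and appends a newline so the objects
    Firehose writes to S3 are one JSON document per line.
    """
    output = []
    for record in event["records"]:
        payload = json.loads(base64.b64decode(record["data"]))
        transformed = json.dumps(payload) + "\n"
        output.append({
            "recordId": record["recordId"],
            "result": "Ok",
            "data": base64.b64encode(transformed.encode()).decode(),
        })
    return {"records": output}
```

Note the event shape differs from a plain Kinesis trigger: Firehose uses lowercase `records` and requires the `recordId`/`result` envelope in the response.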
3 labs available
In this lab, you will create a secure data streaming pipeline using Amazon Kinesis services. You will set up a complete data flow from data ingestion to storage, ensuring data security through encryption. The lab focuses on integrating AWS KMS for encryption and applying proper IAM policies to manage access. This hands-on experience will help reinforce your knowledge of data security in streaming architectures.
In this lab, you will learn how to ensure data encryption in transit and at rest using AWS KMS and Amazon S3. You will create KMS keys and use them to encrypt data stored in S3, exploring both server-side and client-side encryption options. This hands-on experience will solidify your knowledge of securing data in AWS and complying with data privacy requirements.
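For the server-side option, encrypting an S3 object with a customer-managed KMS key comes down to two extra parameters on `put_object`. A sketch that builds the arguments (the bucket, key names, and KMS alias are placeholders for whatever you create in the lab):

```python
def sse_kms_put_args(bucket, key, body, kms_key_id):
    """Build boto3 s3.put_object arguments for SSE-KMS encryption at rest.

    ServerSideEncryption="aws:kms" tells S3 to encrypt the object server-
    side; SSEKMSKeyId selects the customer-managed key (ID, ARN, or alias).
    """
    return {
        "Bucket": bucket,
        "Key": key,
        "Body": body,
        "ServerSideEncryption": "aws:kms",
        "SSEKMSKeyId": kms_key_id,
    }
```

With credentials configured, you would call `boto3.client("s3").put_object(**sse_kms_put_args("lab-bucket", "data/record.json", b"{}", "alias/lab-key"))`; GET requests then require `kms:Decrypt` permission on that key in addition to S3 read access.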
In this lab, you will build a secure data pipeline using Amazon Kinesis Data Streams and Amazon S3. You will integrate these services with AWS IAM and AWS KMS to ensure that your data remains secure both in transit and at rest. You will learn how to set up streaming data ingestion using Kinesis, store encrypted data in S3, and implement fine-grained access control with IAM. This lab emphasizes the practical application of AWS data security practices. By completing this lab, you will enhance your skills in configuring a secure and efficient data pipeline, crucial for handling sensitive data in real-world applications.
2 labs available
In this lab, you'll build a real-time data processing pipeline using Amazon Kinesis services. You will configure Kinesis Data Streams to ingest streaming data, process it with AWS Lambda functions, and store the results in Amazon S3. Additionally, you'll set up AWS Glue to catalog the data and enable quick access with Amazon Athena for data analytics. This lab will give you hands-on experience with critical AWS services that are foundational for real-time analytics solutions, demonstrating how to efficiently integrate streaming and batch processes.
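Once Glue has cataloged the S3 data, querying it from Athena via boto3 takes three arguments. A sketch with placeholder names (the crawler decides the real database and table names):

```python
def athena_query_params(database, query, results_s3_uri):
    """Build arguments for boto3 athena.start_query_execution.

    Athena needs the SQL itself, the Glue Data Catalog database to
    resolve table names against, and an S3 location for query results.
    """
    return {
        "QueryString": query,
        "QueryExecutionContext": {"Database": database},
        "ResultConfiguration": {"OutputLocation": results_s3_uri},
    }

# Illustrative query against a hypothetical cataloged table.
EXAMPLE_QUERY = "SELECT * FROM processed_events LIMIT 10;"
```

With credentials configured, `boto3.client("athena").start_query_execution(**athena_query_params("lab_db", EXAMPLE_QUERY, "s3://results-bucket/athena/"))` returns a query execution ID you poll for results.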
In this lab, you will build a real-time data streaming architecture using Amazon Kinesis Data Streams and AWS Glue. You'll set up a data ingestion pipeline that captures, processes, and catalogs data to make it ready for real-time analytics. This AWS solution reduces latency in data processing, ensuring that insights are available at the speed your business needs. You'll work through a scenario that integrates data from multiple sources into a single platform for consistent, streamlined analytics workflows.