From the course: Apache Airflow Essential Training
Setting up access to Amazon S3 buckets - Apache Airflow Tutorial
- [Instructor] In this demo, we'll see how you can use an S3 hook in Apache Airflow to read data from Amazon S3 buckets. S3 stands for Simple Storage Service, a scalable and highly available object storage service provided by Amazon Web Services. You can see that I'm logged into my AWS account, and I'm going to use the search bar on the management console to search for the S3 service. Click on the first hit, and it will take you to AWS's S3 page. Here, I'm going to create a new bucket to store the data that I'll be using in my Airflow pipeline. Click on Create Bucket and give the bucket a unique name; I'm going to call it loony-credit-card-data. You can accept all of the default settings that you see here for buckets. Click on Create Bucket, and the bucket is created. The data that I'm about to upload is this credit card…
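Once the bucket exists and a file has been uploaded, the pattern the video builds toward can be sketched like this. This is a minimal sketch, not the instructor's exact code: it assumes the Amazon provider package (`apache-airflow-providers-amazon`) is installed, that an Airflow connection with ID `aws_default` holds your AWS credentials, and that the object key `credit_card_data.csv` is hypothetical; the bucket name is the one created in the demo.

```python
# Sketch: reading an object from S3 with Airflow's S3Hook.
# Assumptions (not from the video): connection ID "aws_default",
# object key "credit_card_data.csv".
from airflow.providers.amazon.aws.hooks.s3 import S3Hook


def read_credit_card_data() -> str:
    # The hook resolves credentials from the named Airflow connection.
    hook = S3Hook(aws_conn_id="aws_default")

    # read_key returns the object's contents as a string.
    return hook.read_key(
        key="credit_card_data.csv",           # hypothetical object key
        bucket_name="loony-credit-card-data",  # bucket created in the demo
    )
```

In a DAG, you would typically call a function like this from a `PythonOperator` or a `@task`-decorated callable, which is what a later video in this section walks through.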
Contents
- Setting up for a PostgreSQL pipeline with hooks (2m 28s)
- Creating and running a pipeline with PostgreSQL hooks (6m 28s)
- Setting up access to Amazon S3 buckets (4m 32s)
- Setting up a connection to Amazon S3 buckets (2m 5s)
- Creating and running a pipeline with an S3 hook (5m 43s)