Ready to get the most out of Lightning AI? Let's upload your data to AWS S3 and get growing!

Create AWS S3 Bucket

First, we'll need to create an AWS S3 Bucket.

Follow this guide on creating an AWS S3 Bucket

Create a Python Function

Use the code below to create a Python function. *No changes to this code are needed.*

 # connect to an Amazon S3 bucket
 import boto
 from boto.s3.key import Key

 # define the function to upload a file to an Amazon S3 bucket
 def upload_file_to_s3(awsid, awskey, bucket, filename, remote_folder=None):
     c = boto.connect_s3(awsid, awskey)
     b = c.get_bucket(bucket)
     k = Key(b)
     # name the S3 object after the local file, dropping its directory path
     remote_file_name = filename.split('/')[-1]
     if remote_folder:
         remote_file_name = remote_folder + '/' + remote_file_name
     k.key = remote_file_name
     # upload the local file's contents to the S3 object
     k.set_contents_from_filename(filename)
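
For reference, here's a small standalone sketch of how the function above derives the S3 object key from the local filename (the `build_remote_key` helper is a hypothetical name used only for illustration; it is not part of the upload code):

```python
# hypothetical helper mirroring the key-naming logic in upload_file_to_s3
def build_remote_key(filename, remote_folder=None):
    # keep only the file's base name, discarding any local directory path
    remote_file_name = filename.split('/')[-1]
    # optionally nest the file under a folder prefix inside the bucket
    if remote_folder:
        remote_file_name = remote_folder + '/' + remote_file_name
    return remote_file_name

print(build_remote_key('/tmp/exports/users.csv', 'uploads'))  # uploads/users.csv
print(build_remote_key('users.csv'))                          # users.csv
```

So a file at `/tmp/exports/users.csv` uploaded with `remote_folder='uploads'` lands in your bucket as `uploads/users.csv`.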

Run a SQL Query

Now, we'll run a SQL query that will gather your data and save it to your AWS S3 Bucket.

 import csv
 import psycopg2

 # connect to the database (replace the placeholders with your credentials)
 conn = psycopg2.connect("dbname='{dbname}' user='{user}' host='{host}' password='{password}' port='{port}'")
 cur = conn.cursor()

 # enter your SQL query here
 sql_query = '''write SQL here that will return desired results'''

 # run the query and capture the results
 cur.execute(sql_query)
 results = cur.fetchall()

 # save the results to a CSV file
 with open('temp-file.csv', 'w', newline='') as a:
     writer = csv.writer(a, delimiter=',', quotechar='"', quoting=csv.QUOTE_MINIMAL)
     for result in results:
         writer.writerow(result)

 # upload the file to S3 (replace the placeholders with your AWS credentials and bucket name)
 upload_file_to_s3('AWS_ID', 'AWS_KEY', 'BUCKET_NAME', 'temp-file.csv')
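
If you'd like to sanity-check the CSV step before touching your database, here's a minimal, self-contained sketch that writes sample rows with `csv.writer` the same way the query results are written above (the sample rows are made up for illustration, standing in for `cur.fetchall()` results):

```python
import csv

# made-up sample rows standing in for cur.fetchall() results
results = [(1, 'alice', 'alice@example.com'),
           (2, 'bob', 'bob@example.com')]

# write the rows to a CSV file, quoting fields only when necessary
with open('temp-file.csv', 'w', newline='') as a:
    writer = csv.writer(a, delimiter=',', quotechar='"', quoting=csv.QUOTE_MINIMAL)
    for result in results:
        writer.writerow(result)

# read the file back to confirm the contents round-trip
with open('temp-file.csv', newline='') as a:
    rows = list(csv.reader(a))
print(rows)  # [['1', 'alice', 'alice@example.com'], ['2', 'bob', 'bob@example.com']]
```

Note that `csv.reader` returns every field as a string, which is fine here since the file is only an intermediate artifact on its way to S3.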

Share Data with Lightning AI

Your data is now saved to your AWS S3 Bucket. Let's share that data with Lightning AI via a custom URL.

Follow this guide on sharing an AWS S3 Bucket

Email the URL to us and we'll take care of the rest!


You've now connected your AWS S3 Bucket to Lightning AI!
