
EFS + Python Lambda + S3

How can I implement the architecture below?

5(a):- The SFTP server stores the encrypted file on the AWS EFS volume that is mounted on the SFTP server.

6(a):- The file is available in the AWS cloud via the mounted EFS.

7(a):- The data is transferred to an S3 bucket.

File Decryption

  • The file-decryption Lambda function is executed when a new file lands on the EFS. This can use CloudWatch Events, triggered when the final trigger file is copied.
  • The function picks up the encrypted files, decrypts them, adds the required metadata, and sends them to the S3 bucket.
  • It deletes the encrypted (gpg format) and unencrypted files once they have been sent to Caspian.
  • The function also maintains the lifecycle of the files on the EFS.
  • This can also be implemented as a subscriber/worker Lambda function pattern.

My Architecture

My question:-

  • I need to write a Python Lambda function that performs the 7(a) activities.

My basic starter code:-

import boto3
import gnupg

# Create the S3 client once, outside the handler, so warm invocations reuse it
s3 = boto3.client('s3')

# Initialise GnuPG; keeping the keyring at /efs/.gnupg is an assumption
gpg = gnupg.GPG(gnupghome='/efs/.gnupg')

def lambda_handler(event, context):
    key = "/efs/iamfile.txt"
    with open(key, "rb") as stream:
        decrypted_data = gpg.decrypt_file(stream)
    bucket_name = "op-efs-vpc"
    output_name = "decrypted_data"
    # Upload the decrypted bytes, not the still-encrypted source file
    s3.put_object(Bucket=bucket_name, Key=output_name, Body=decrypted_data.data)
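A fuller sketch of the 7(a) activities (decrypt, add metadata, upload to S3, delete from EFS) could look like the following. This is a sketch under assumptions, not a definitive implementation: the `/efs` mount path, the `.gpg` suffix, the keyring location, the `Metadata` keys, and the `efs_path` field in the trigger event are all placeholders to adapt; the bucket name `op-efs-vpc` is taken from the starter code above.

```python
import os

EFS_ROOT = "/efs"        # EFS mount path configured on the Lambda (assumption)
BUCKET = "op-efs-vpc"    # target bucket name from the starter code

def s3_key_for(efs_path):
    """Map an encrypted file's EFS path to its S3 object key (drop the .gpg suffix)."""
    rel = os.path.relpath(efs_path, EFS_ROOT)
    return rel[:-4] if rel.endswith(".gpg") else rel

def process_file(efs_path, s3, gpg):
    # 1. Decrypt the file read from the mounted EFS
    with open(efs_path, "rb") as fh:
        result = gpg.decrypt_file(fh)
    if not result.ok:
        raise RuntimeError("decryption failed: " + result.status)

    # 2. Upload the decrypted bytes to S3 with metadata (example keys only)
    s3.put_object(Bucket=BUCKET, Key=s3_key_for(efs_path),
                  Body=result.data, Metadata={"source": "sftp-efs"})

    # 3. Delete the encrypted copy from EFS once the upload succeeds
    os.remove(efs_path)

def lambda_handler(event, context):
    # boto3/gnupg are imported here so the pure helpers above can be
    # exercised without the AWS and GnuPG dependencies installed
    import boto3
    import gnupg
    s3 = boto3.client("s3")
    gpg = gnupg.GPG(gnupghome=os.path.join(EFS_ROOT, ".gnupg"))  # assumed keyring path
    # The event shape depends on the trigger (e.g. a CloudWatch Events rule
    # firing when the final trigger file lands); here it is assumed to carry
    # the EFS path of the new encrypted file.
    process_file(event["efs_path"], s3, gpg)
```

Keeping the decrypt/upload/delete logic in `process_file` also makes it easy to reuse from a worker Lambda if the subscriber/worker pattern mentioned above is adopted.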
