John Davidson

php - Handling large file uploads and sending to API



I'm stuck wondering what the best solution is for handling large file uploads and sending them to a third-party API. Any pointers on what would be a good solution would be very welcome. Thank you in advance.


The end goal is to send video files to this API - https://docs.bunny.net/reference/manage-videos#video_uploadvideo. The complication is that the files are often large - up to 5GB in size.


I have an existing website built in PHP7 that runs on a LAMP setup on Amazon Lightsail and I want to add a feature for users to upload video files.


Currently I'm uploading the files directly to Amazon S3 using a pre-signed URL. This part is working fine.


But I need to send the files to the API mentioned above. This is where I get stuck!


I think there are two options to explore - (1) find a way to upload directly to the API and skip the S3 upload, or (2) continue uploading to S3 first and then transfer to the API. But I'm not sure if option 1 is even possible, or how to do option 2!


With option 1, I'm wondering if there's a way to upload the files from the users directly to the API. If I do this using a regular HTML form upload, the files are stored temporarily on my server before I can use cURL through PHP to transfer them to the API. This is really time-consuming and feels very inefficient. But I don't know how else to send the files to the API without them first being on my server. Maybe there's an option here that I don't know about!
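For reference, the PHP-to-API hop can at least avoid buffering the whole file in memory by streaming it with cURL. A minimal sketch, assuming the PUT endpoint and AccessKey header described in the Bunny Stream docs linked above (the library ID, video ID, and API key are placeholders, and the video record must already have been created via the API):

```php
<?php
// Hypothetical helper: builds the Bunny Stream upload endpoint for a video,
// per the docs linked above. $libraryId and $videoId come from the earlier
// "create video" API call.
function bunny_upload_url(int $libraryId, string $videoId): string {
    return "https://video.bunnycdn.com/library/{$libraryId}/videos/{$videoId}";
}

// Stream a local file (e.g. PHP's upload tmp file) to the Bunny Stream API.
// CURLOPT_INFILE makes cURL read from the handle chunk by chunk, so even a
// 5GB file never has to fit in PHP's memory at once.
function bunny_upload_file(string $path, int $libraryId, string $videoId, string $apiKey): bool {
    $fh = fopen($path, 'rb');
    $ch = curl_init(bunny_upload_url($libraryId, $videoId));
    curl_setopt_array($ch, [
        CURLOPT_UPLOAD         => true,             // issue a PUT with a request body
        CURLOPT_INFILE         => $fh,              // stream the body from this handle
        CURLOPT_INFILESIZE     => filesize($path),
        CURLOPT_HTTPHEADER     => ["AccessKey: {$apiKey}"],
        CURLOPT_RETURNTRANSFER => true,
    ]);
    curl_exec($ch);
    $ok = curl_getinfo($ch, CURLINFO_HTTP_CODE) === 200;
    curl_close($ch);
    fclose($fh);
    return $ok;
}
```

This doesn't remove the double hop (user to server, server to API), but it keeps memory flat; the remaining cost is transfer time on the server.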


With option 2, I can already upload large files directly to S3 with pre-signed URLs and this process seems to run fine. But I don't know how I would then send the file from S3 to the API. I can use an S3 trigger on new files. But when I looked at Lambda, they have a tiny file size limit. Because my site is hosted on Lightsail, I noticed they have a container option. But I don't know if that can be used for this purpose and if so, how.


Basically, I'm not sure what solution is best, nor how to proceed with that. And maybe there's an option 3 that I'm not aware of!


I would welcome your input.


Many thanks in advance.



My Answer:
One workable approach is to combine options 1 and 2: keep having users upload directly to S3 with pre-signed URLs, as you do now, and use an S3 "object created" event to kick off a transfer of the file from S3 to the third-party API.

You're right to be wary of Lambda's limits: ephemeral storage defaults to 512 MB (configurable up to 10 GB), execution time is capped at 15 minutes, and there is no built-in PHP runtime. The way around the storage limit is to stream the object from S3 to the API rather than downloading it to disk first, so the file never has to fit in local storage. If the 15-minute cap is too tight for a 5 GB transfer, run the same streaming code as a background worker on your Lightsail server instead, triggered by the S3 event (for example via a queue).
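Here is a rough PHP sketch of that S3-to-API transfer, suitable for running on the Lightsail box. It assumes you can generate a pre-signed GET URL for the S3 object, the same way the site already generates pre-signed upload URLs, and that the Bunny endpoint and AccessKey header match the linked docs; all IDs and keys are placeholders:

```php
<?php
// Pure helper: pick Content-Length out of get_headers()-style output
// (with $associative = 1, header names are array keys; redirects can
// make a value an array, in which case the last entry applies).
function content_length(array $headers): int {
    foreach ($headers as $name => $value) {
        if (is_string($name) && strcasecmp($name, 'Content-Length') === 0) {
            return (int) (is_array($value) ? end($value) : $value);
        }
    }
    return -1;
}

// Pipe the object from S3 straight to the Bunny Stream API without ever
// writing it to local disk: fopen() on the pre-signed URL gives a read
// stream, and cURL pulls chunks from it as it sends the PUT body.
function stream_s3_to_bunny(string $presignedGetUrl, int $libraryId, string $videoId, string $apiKey): bool {
    $size = content_length(get_headers($presignedGetUrl, 1) ?: []);
    $src  = fopen($presignedGetUrl, 'rb');
    $ch   = curl_init("https://video.bunnycdn.com/library/{$libraryId}/videos/{$videoId}");
    curl_setopt_array($ch, [
        CURLOPT_UPLOAD         => true,   // PUT with a streamed body
        CURLOPT_INFILE         => $src,
        CURLOPT_INFILESIZE     => $size,
        CURLOPT_HTTPHEADER     => ["AccessKey: {$apiKey}"],
        CURLOPT_RETURNTRANSFER => true,
    ]);
    curl_exec($ch);
    $ok = curl_getinfo($ch, CURLINFO_HTTP_CODE) === 200;
    curl_close($ch);
    fclose($src);
    return $ok;
}
```

The same logic works in a Lambda (in a supported runtime like Node or Python, or PHP via a custom runtime); the only difference is where it runs and what triggers it.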

Alternatively, you could look at AWS Transfer Family, which puts a managed SFTP server in front of S3 - but that only helps if your users upload with SFTP clients rather than through a browser, so it probably doesn't fit this use case. It's also worth checking whether the API supports direct browser uploads: Bunny's Stream documentation describes a resumable (TUS) upload endpoint with signed authorization, which, if it covers your needs, would be your "option 3" and let you skip S3 entirely.

Overall, the key is to avoid ever buffering the whole file on your own infrastructure: either stream it from S3 to the API, or let users upload straight to the destination.





© 2024 Hayatsk.info - Personal Blogs Platform. All Rights Reserved.