I know you can't run PHP on Amazon S3, but is there a way to host a static site on S3 and have a contact form that runs using PHP on another server? In essence, does the PHP file need to be on S3 for it to run properly?
No, the PHP file does not have to be on S3. Just put it on the other server and refer to the absolute (full) URL.
Pages can submit forms to other domains (e.g. to the PHP server). This is actually what makes CSRF possible even when a server requires POST requests.
Pages can load images, scripts, and stylesheets from other domains. In fact, it is very common for these assets to be loaded from separate servers or even third-party content delivery networks. For example, Stack Overflow loads its copy of jQuery from //ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js.
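A minimal sketch: the static page lives on S3, and the form's action points at the PHP server with an absolute URL (the domain names here are placeholders):

```html
<!-- Static page hosted on S3; the form posts to the PHP server -->
<form action="https://php.example.com/contact.php" method="post">
  <input type="text" name="name">
  <input type="email" name="email">
  <textarea name="message"></textarea>
  <button type="submit">Send</button>
</form>
```

After processing the submission, contact.php can redirect the visitor back to a thank-you page on the S3 site.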
Related
I'm stuck wondering what the best solution is to handling large file uploads and sending them to a third-party API. Any pointers on what would be a good solution would be very welcome. Thank you in advance.
The end goal is to send video files to this API - https://docs.bunny.net/reference/manage-videos#video_uploadvideo. The complication is that the files are often large - up to 5GB in size.
I have an existing website built in PHP7 that runs on a LAMP setup on Amazon Lightsail and I want to add a feature for users to upload video files.
Currently I'm uploading the files directly to Amazon S3 using a pre-signed URL. This part is working fine.
But I need to send the files to the API mentioned above. This is where I get stuck!
I think there are two options to explore: (1) find a way to upload directly to the API and skip the S3 upload, or (2) continue uploading to S3 first and then transfer to the API. But I'm not sure if option 1 is even possible, or how to do option 2!
With option 1, I'm wondering if there's a way to upload the files from the users directly to the API. If I do this using the regular HTML form upload, then the files are stored temporarily on my server before I can use cURL through PHP to transfer them to the API. This is really time consuming and feels very inefficient. But I don't know how else to send the files to the API without them first being on my server. Maybe there's an option here that I don't know about!
With option 2, I can already upload large files directly to S3 with pre-signed URLs and this process runs fine. But I don't know how I would then send the file from S3 to the API. I can use an S3 trigger on new files, but when I looked at Lambda, its file size limits are far below what I need. Because my site is hosted on Lightsail, I noticed it has a container option, but I don't know if that can be used for this purpose and, if so, how.
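One possible shape for option 2, without Lambda: have the Lightsail server stream the object from a pre-signed S3 GET URL straight into the API with cURL, so the large file is never buffered in memory or saved to local disk. The endpoint path and "AccessKey" header below are taken from the linked Bunny docs but should be verified; all the variable values are placeholders.

```php
<?php
// Sketch: stream an object from S3 into a third-party API via cURL.
// Assumptions: the pre-signed URL, library/video IDs, API key, and
// object size are placeholders; verify the endpoint against the docs.

$s3Url       = 'https://example-bucket.s3.amazonaws.com/uploads/video.mp4?...'; // pre-signed GET URL
$sizeInBytes = 5000000000; // object size, e.g. from an S3 HeadObject call
$libraryId   = '12345';
$videoId     = 'abcde';
$apiKey      = 'your-bunny-api-key';

$src = fopen($s3Url, 'rb'); // requires allow_url_fopen = On

$ch = curl_init("https://video.bunnycdn.com/library/{$libraryId}/videos/{$videoId}");
curl_setopt_array($ch, [
    CURLOPT_UPLOAD         => true,          // PUT request, body read from CURLOPT_INFILE
    CURLOPT_INFILE         => $src,
    CURLOPT_INFILESIZE     => $sizeInBytes,
    CURLOPT_HTTPHEADER     => ['AccessKey: ' . $apiKey],
    CURLOPT_RETURNTRANSFER => true,
]);

$response = curl_exec($ch);
fclose($src);
```

Because the transfer runs S3-to-API over AWS's network rather than through the user's connection, it should be much faster than re-uploading; for a 5 GB file you would still want to run it as a background job rather than inside a web request.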
Basically, I'm not sure what solution is best, nor how to proceed with that. And maybe there's an option 3 that I'm not aware of!
I would welcome your input.
Many thanks in advance.
I'm trying to load PHP files such as index.php using AWS CloudFront.
The documentation states -
Create a web distribution if you want to:
Speed up distribution of static and dynamic content, for example,
.html, .css, .php, and graphics files. Distribute media files using
HTTP or HTTPS. Add, update, or delete objects, and submit data from
web forms. Use live streaming to stream an event in real time.
However, when I upload PHP files to the relative CloudFront bucket it ends up downloading the file and opening it. What will allow me to host PHP files?
However, when I upload PHP files to the relative CloudFront bucket
There is no such thing as a CloudFront bucket, so you are likely referring to an S3 bucket, configured behind CloudFront as an origin.
CloudFront works with dynamic content, such as might be generated with PHP, but the PHP site needs to be hosted on a server that supports it -- not S3.
You can host a static website on Amazon Simple Storage Service (Amazon S3). On a static website, individual webpages include static content. They might also contain client-side scripts. By contrast, a dynamic website relies on server-side processing, including server-side scripts such as PHP, JSP, or ASP.NET. Amazon S3 does not support server-side scripting. (emphasis added)
https://docs.aws.amazon.com/AmazonS3/latest/dev/WebsiteHosting.html
See AWS Web Site Solutions for options for hosting sites involving static or dynamic content, bearing in mind that PHP requires a solution supporting server-side scripting and dynamic content, so not all solutions presented there (including S3) will fit your needs... but these are all compatible with CloudFront -- which is only tasked with delivering the rendered content, not generating it.
CloudFront is designed to serve content to end users and not execute your code. Your PHP files would be on an EC2 instance running PHP and a webserver (Apache, Nginx) which you could then put behind CloudFront to get the benefits. This would then generate the HTML for CloudFront to serve. CloudFront itself does not handle the processing and just deals with the static HTML. When using CloudFront with S3 it will serve up the content directly to the end user.
I am not quite sure where you found that snippet but the introduction does not seem to list .php for me.
https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/Introduction.html
Amazon CloudFront is a web service that speeds up distribution of your
static and dynamic web content, such as .html, .css, .js, and image
files, to your users.
I was wondering if it's possible to manipulate or change the names of files upon deploying the distribution.
The reason for doing this is that we don't have the actual files on our own servers; they are provided by a partner. Is it somehow possible to run a PHP function upon deploy to change the name of the file on the CDN?
So eg.
partner.example.com/image/123120913.jpg
to
1234.cloudfront.net/image/SHOE-NAME.jpg
One way would be to import all images to local storage first and change the filename as each one is downloaded, but that seems very heavy-handed. We can easily provide the image name, if only it were possible to run a PHP function upon deploy.
Amazon CloudFront is a caching service that retrieves content from a specified origin (eg web server, Amazon S3), stores it in a cache and then serves it to users.
Amazon CloudFront does not create aliases to filenames. It simply passes the request to the origin. If the origin is a web server, you could write a web app that returns any type of information given the request URL, but CloudFront cannot rename or map filenames.
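If the origin is a web server you control, a small script there can do the mapping for you: CloudFront requests the friendly name, and the script fetches and returns the partner's real file. A minimal sketch, assuming the mapping lives in a hard-coded array and the partner domain is a placeholder:

```php
<?php
// Sketch: origin script behind CloudFront that maps a friendly filename
// to the partner's real file URL. The array and domain are assumptions;
// in practice the mapping would come from a database or product feed.

$map = [
    'SHOE-NAME.jpg' => 'https://partner.example.com/image/123120913.jpg',
];

$name = basename($_GET['name'] ?? '');
if (!isset($map[$name])) {
    http_response_code(404);
    exit;
}

header('Content-Type: image/jpeg');
header('Cache-Control: public, max-age=86400'); // let CloudFront cache the renamed copy
readfile($map[$name]); // stream the partner's file through to CloudFront
```

With a long Cache-Control value, CloudFront only hits this script on a cache miss, so most requests never touch the partner's server.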
I am currently writing an application using the Yii-framework in PHP that stores a large number of files uploaded by the users of the application. I decided that since the number of files are going to be ever increasing, it would be beneficial to use Amazon S3 to store these files and when requested, the server could retrieve the files and send it to the user. (The server is an EC2 instance in the same zone)
Since the files are all confidential, the server has to verify the identity of the user and their credentials before allowing the user to receive the file. Is there a way to send the file to the user in this case directly from S3, or do I have to pull the data to the server first and then serve it to the user?
If so, is there any way to cache the recently uploaded files on the local server so that it does not have to go to S3 to look for the file? In most cases, the most recently uploaded files will be requested repeatedly by multiple clients.
Any help would be greatly appreciated!
Authenticated clients can download files directly from S3 by signing the appropriate URLs on the server prior to displaying the page/urls to the client.
For more information, see: http://s3.amazonaws.com/doc/s3-developer-guide/RESTAuthentication.html
Note that for confidential files you may also want to consider server-side/client side encryption. Finally, for static files ( such as images ) you may want to set the appropriate cache headers as well.
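With the current AWS SDK for PHP (v3), generating such a signed URL looks roughly like this; the bucket name, key, and expiry are placeholders, and the SDK is assumed to be installed via Composer:

```php
<?php
// Sketch: after your app has authenticated the user, generate a
// short-lived pre-signed GET URL for a private S3 object.
// Bucket, key, and region below are placeholders.

require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client([
    'region'  => 'us-east-1',
    'version' => 'latest',
]);

$cmd = $s3->getCommand('GetObject', [
    'Bucket' => 'my-confidential-bucket',
    'Key'    => 'uploads/report.pdf',
]);

// The URL is valid for 10 minutes; after that the client must ask
// your server for a fresh one (re-checking their credentials).
$request = $s3->createPresignedRequest($cmd, '+10 minutes');
$url = (string) $request->getUri();
```

Embed $url in the page you serve to the authenticated user; S3 then delivers the file directly, keeping the download traffic off your EC2 instance.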
Use Amazon CloudFront to serve these static files. Rather than sending the files to the user, send them links to the files. The links need to be CloudFront links, not direct links to the S3 bucket.
This has the benefit of keeping load low on your server as well as caching files close to your users for better performance.
More details here Serving Private Content through CloudFront
As far as I have read, PHP can only get the file listing from the local server on which the script is running.
What I need is the list of files in a directory on an external URL, which is not FTP but an HTTP URL, such as www.google.com. Is this possible in PHP?
Here is an example of what I want (but FDM is a C++ app)!
You can only see this if the webserver allows it
This is not possible in any language.
If a remote server does not want to list directory contents (i.e. if it's configured not to), no external script can generate one; that would be insecure.
Free Download Manager does not show the files in the folder, but all the links found on the web page. You can fetch a web page with cURL and grab all the links from it (using regular expressions or an HTML parser), then download the linked pages; that's how web spiders are built. But you cannot get a list of the files that are on the server, only the ones that are linked from a publicly available web page.
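A self-contained sketch of the link-grabbing approach, using DOMDocument rather than regular expressions (in practice $html would come from file_get_contents() or cURL; a literal string is used here so the example runs on its own):

```php
<?php
// Sketch: extract every link from a page's HTML. This finds linked
// URLs only -- not the server's actual directory contents.

$html = '<html><body>
  <a href="/files/report.pdf">Report</a>
  <a href="https://example.com/files/photo.jpg">Photo</a>
</body></html>';

$doc = new DOMDocument();
@$doc->loadHTML($html); // suppress warnings from real-world markup

$links = [];
foreach ($doc->getElementsByTagName('a') as $a) {
    $links[] = $a->getAttribute('href');
}

print_r($links); // "/files/report.pdf" and "https://example.com/files/photo.jpg"
```

Anything on the server that no page links to is simply invisible to this technique.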
You can see server files only if the server allows that option; alternatively, you can install your own script that will do that work for you, independent of the server settings. That also means you have to have access to the server whose files you want to list.