AWS S3 problem with downloading files in a PHP script

Maybe I will find some help here. For several days we have been trying to get file downloads from S3 working in a PHP Laravel application running on an Apache app server. The readStream() and download() functions of the S3 disk are not working; the request ends with a 504 Gateway Timeout error. I think there is an access problem.

Two important points: uploads work normally, and downloads work when the application runs locally with a local server.

I'm not in charge of server management and I don't have much information about the server, but to access it over FTP (via FileZilla, for example) the external team in another country has to whitelist my IP. Could this be a firewall issue? Since everything works locally, there is presumably a problem with the configuration of the server. We've been working on this for several days and I really need someone with more knowledge of these kinds of issues.
Thanks.
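For context, here is a rough sketch of how such a download is usually wired up in Laravel, assuming an "s3" disk is configured in config/filesystems.php; the route and path below are placeholders, not the actual project code. If the same code works locally but returns a 504 on the Apache server, the code itself is probably fine and the server simply cannot reach the S3 endpoint.

    <?php
    // routes/web.php -- hypothetical route, for illustration only.
    use Illuminate\Support\Facades\Route;
    use Illuminate\Support\Facades\Storage;

    Route::get('/files/{path}', function (string $path) {
        $disk = Storage::disk('s3');

        // Fail fast with a clear error instead of letting the gateway time out.
        if (! $disk->exists($path)) {
            abort(404, "File not found on S3: {$path}");
        }

        // download() streams the object back to the browser as an attachment.
        return $disk->download($path);
    })->where('path', '.*');

A quick way to separate a code problem from a network problem is to run Storage::disk('s3')->exists('some-known-key') from php artisan tinker on the production server: if that call also hangs, outbound HTTPS from that server to the S3 endpoint is being blocked, which would point to the firewall/whitelisting issue mentioned above.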

Related

WordPress site taking too long to load after migrating to remote database

I have a WordPress site running on a DigitalOcean droplet with the MySQL database on the same server. I am trying to migrate the database to a remote database. I used DigitalOcean's managed database service in the same region, but after changing the database configuration in wp-config.php, the site now takes 30+ seconds to load.
I also tried GCP's Cloud SQL but ran into the same issue.
I would suggest first confirming that you can reach the remote SQL server yourself and measuring the response time from the web server (a quick way to test this is sketched below).
If everything looks fine and both servers are in the same region, ask DigitalOcean whether the two servers can communicate over the private network instead of connecting through public IPs.
Finally, if you are still experiencing the issue, contact their support and explain it. They should be able to help, since it's a "managed" service.
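As a rough way to check that first point, a small standalone script run from the droplet itself can time the connection and a trivial query; the host, credentials and port below are placeholders, so substitute the values shown in the DigitalOcean control panel (managed databases often require TLS as well, in which case mysqli_real_connect with MYSQLI_CLIENT_SSL may be needed).

    <?php
    // check-db-latency.php -- run this on the web server, not on your laptop.
    $host = 'your-managed-db-host'; // placeholder
    $port = 25060;                  // placeholder: use the port from the control panel
    $user = 'dbuser';               // placeholder
    $pass = 'dbpass';               // placeholder
    $name = 'wordpress';            // placeholder

    $start  = microtime(true);
    $mysqli = new mysqli($host, $user, $pass, $name, $port);

    if ($mysqli->connect_errno) {
        die('Connect failed: ' . $mysqli->connect_error . PHP_EOL);
    }
    printf("Connected in %.3f s\n", microtime(true) - $start);

    $start = microtime(true);
    $mysqli->query('SELECT 1');
    printf("SELECT 1 ran in %.3f s\n", microtime(true) - $start);

Connecting in a few tens of milliseconds is normal; if every connection takes hundreds of milliseconds or more, the dozens of queries WordPress runs per page load will add up to exactly the kind of 30-second delay described above.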

Using php with angular application hosted on AWS

I'm new to AWS. I created a DB and I'm currently hosting the website there. So far, so good!
The problem is that I don't know how to integrate the PHP files I was using before moving to AWS.
My previous stack was Angular, PHP, MySQL and Apache, everything running locally, so I was able to use the same machine for the database and for running the PHP files.
What I tried was uploading the PHP files into the S3 bucket. That doesn't work: the website just downloads the file instead of executing it.
So how can I get the same seamless PHP integration on AWS that I had on my local computer?
If you want to do this on AWS you will need an EC2 instance to run your dynamic queries against the database. S3 will only host static files or code that runs on the client side (JavaScript).
If it is a basic website, you might want to consider Lightsail.
My suggestion would be to
use EC2 instead of S3 for hosting the PHP website, as it gives you complete control over the server.
For setting up EC2 with PHP, please follow the steps in the following article:
Setting up EC2 (Ubuntu instance) for PHP

What kind of remote server service was I using?

I'm part of a very small company that uses a database hosted on a server (104.131.##.###). However, the server no longer responds, and the person who set up and owns the server space has already left the company. This former employee seems disgruntled, so they won't help. It's complicated, but we decided to set up a new server. The only issue is, I don't know what we were using.
What I do know is that I would access and change the database at http://104.131.##.###/phpmyadmin/ (image of the login below)
I also had PHP files stored on the server, uploaded with FileZilla (in a "var" folder, if that helps), which were accessed via a path like http://104.131.96.###/path/to/file.php
I've set up a version of the same server using XAMPP on my own computer, but I can't keep my computer running constantly.
So my question is: what service were we using, and what should we use? Where would I start setting up a new server like this? (I still have the PHP files and can recreate the DB.)
I've looked into AWS and DigitalOcean, but I'm in a bit over my head and can't tell if they're offering what we need.
Any help would be appreciated. Thank you
The server was running phpMyAdmin and MySQL and was hosted on DigitalOcean.
So you need at least a LAMP stack. With the information given, we can't help you much more than that.

PHP shell_exec doesn't work in Windows Azure

We need to execute an .exe file on a remote Windows Azure server.
We call it from PHP with shell_exec. The .exe should create new files in two different folders on the server, generate entries in a database and return a string, but it doesn't work.
We don't have any problem executing it on our local server with Windows 7 Enterprise and IIS 7. That's why we thought it could be a permissions problem, so we created a .user.ini file with the following content:
safe_mode= off
safe_mode_exec_dir= off
Unfortunately that doesn't work either.
Any suggestions?
Thanks in advance.
You are most probably working with Windows Azure Web Sites. This is high-density shared hosting with tightened security. If you need things like shell_exec, you should move to a Windows Azure Cloud Service (Web Role), where you have full control over the OS and the web server / PHP settings.
Using a Cloud Service you will be able to use shell_exec. However, when you move to a Cloud Service you have to start thinking about saving files in Azure Blob Storage, because the local storage for a Cloud Service is:
Not persistent
Not synchronized across role instances
If you don't yet know what a Role and a Role Instance are, or are a little confused, please go through this article.
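For what it's worth, a minimal sketch of the shell_exec call itself, with stderr redirected so failures are visible; the executable path and arguments are placeholders, not the real application:

    <?php
    // Hypothetical executable path and arguments -- adjust to the real ones.
    // 2>&1 redirects stderr so error messages show up in the returned string.
    $command = 'C:\\approot\\tools\\generate.exe --input data.xml 2>&1';

    $output = shell_exec($command);

    // shell_exec returns null when the command cannot be run or produces no
    // output, so check for that explicitly while debugging.
    if ($output === null) {
        echo 'No output -- the command may have failed, or shell_exec may be disabled.';
    } else {
        echo $output;
    }

On Web Sites this will typically fail no matter how it is written, for the reasons above; on a Cloud Service web role the same code should run, provided the .exe is deployed with the role and its path is correct.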

Is it possible write a PHP script that transfers through FTP a file from a user's computer to an FTP server without using HTML file upload?

I'm not sure how common my request is but here goes:
I have a client who wants to be able to receive files of up to 2GB in size from their customers. The reason the files are fairly big is that they are graphic design files. Ideally, they would have their customers transfer the files via FTP using an FTP client like FileZilla. However, my client has stated that they spend way too much time trying to school people on how to enter FTP credentials into an FTP program, and on the concept of FTP in general.
So ultimately my client would like the customer to be able to use a web interface, which they're already familiar with, to accomplish the same task. For example, they'd like the customer to fill out a form and hit an upload button. Simple as that.
At this point I should state that I'm working on a WordPress site on top of a Rackspace Cloud Sites server (shared hosting).
I'm using a WordPress plugin that allows me to do this, and it works for small files but not for files approaching 500MB. After speaking to a Rackspace Cloud tech support person, I've narrowed it down to the temporary directory /tmp/ on the Apache server: the server is not able to write large files into that directory because of server-wide restrictions, which they cannot change to accommodate my needs. They said I would need a dedicated server for that, which is not an option for me.
After doing some thinking, though, I've come to the conclusion that it's silly to have to upload a file to the server's temporary directory only to move it to the FTP server. So this brings me to my question: is it possible for a web-based PHP script to send the file directly from the user's machine to the FTP server, bypassing the web server?
If not, do you have any other suggestions? Any help would be appreciated. Thanks.
No, it's not possible at all.
My suggestion? Search for and learn how to use HTML5 uploads for large files; a rough sketch of the idea is below.
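One way to read that suggestion, sketched very loosely: split the file into small chunks in the browser (with the HTML5 File API) and POST them one at a time to a PHP endpoint that appends them, so no single request ever comes near the shared host's temp-file limits. The file name, field names and paths below are hypothetical:

    <?php
    // upload-chunk.php -- hypothetical endpoint; 'chunk', 'name' and 'index'
    // are made-up field names that the browser-side code would have to send.
    $name  = basename($_POST['name'] ?? '');
    $index = (int) ($_POST['index'] ?? 0);

    if ($name === '' || ! isset($_FILES['chunk'])) {
        http_response_code(400);
        exit('Missing chunk or file name.');
    }

    $target = __DIR__ . '/uploads/' . $name . '.part';

    // Append each small chunk to the growing file; individual chunks stay
    // well under the host's upload and temp-directory limits.
    $data = file_get_contents($_FILES['chunk']['tmp_name']);
    file_put_contents($target, $data, $index === 0 ? 0 : FILE_APPEND);

    echo 'ok';

Whether this is workable on Rackspace Cloud Sites depends on their execution-time and request limits, so treat it as a direction to investigate rather than a drop-in fix.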
It seems someone has found a solution to your problem. Please refer to this question:
Stream FTP download to output
That question was about how to stream/pipe a file from FTP through HTTP to the user's browser.
It seems the answer from @strager is what you need.
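For reference, the gist of that approach, sketched with placeholder FTP credentials and file names: PHP opens the file on the FTP server as a stream and copies it straight to the browser, so the full file never has to sit in the web server's /tmp.

    <?php
    // Hypothetical FTP credentials and path -- replace with real values.
    $remote = 'ftp://user:password@ftp.example.com/designs/artwork.psd';

    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="artwork.psd"');

    // fopen() with the ftp:// wrapper returns a read stream; stream_copy_to_stream()
    // pushes it to the output in chunks instead of loading the whole file into memory.
    $in  = fopen($remote, 'rb');
    $out = fopen('php://output', 'wb');

    if ($in === false || $out === false) {
        http_response_code(500);
        exit('Could not open the FTP stream.');
    }

    stream_copy_to_stream($in, $out);
    fclose($in);
    fclose($out);

Note that this covers the download direction (FTP to the customer's browser); uploads still have to pass through the web server in some form, which is why the chunked-upload idea above is usually paired with it.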
