Edit PHP file on AWS Elastic Beanstalk

When I ran my web site on my old server, I would launch Transmit on my Mac (OS X 10.11.6), connect to the server, Control-click to open the remote PHP file, make the fix, and save. The file was updated on the server within a second. That was great for running php/mysql/google_service tests that I can't run locally.
Now I have just moved my project to an Amazon server, AWS. Every time I need to run a test (for example against the S3 bucket, which I can't do locally), modify a variable, or change a flag, I have to make the change in my local php/html/java/css/apis project, zip it, upload it via the Elastic Beanstalk panel, wait about half a minute, and then run it. I have found no way to edit a single file easily (open, write, save) as I did before through Transmit. I can't keep working this way; it wastes too much time.
Do you know any better way to develop/test/run my project on AWS?

Have you considered using Docker?
https://aws.amazon.com/about-aws/whats-new/2015/04/aws-elastic-beanstalk-cli-supports-local-development-and-testing-for-docker-containers/
Or use MAMP?
https://www.mamp.info/en/
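
If you go the Docker route, the workflow with the EB CLI looks roughly like this (a sketch assuming EB CLI 3.x and a Docker-based environment; the project directory and port are placeholders):

    cd my-eb-project           # your local project directory (placeholder)
    eb init                    # link the directory to an Elastic Beanstalk application
    eb local run --port 8080   # build and run the container on your own machine
    # edit your PHP files, test against http://localhost:8080, then:
    eb deploy                  # push the working version to AWS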

Related

How to set up online directory that two servers can access?

I have a very weird but specific situation. I use a XAMPP localhost on my Mac (call it server1) and a PHP app running on Heroku (call it server2). I need to move a txt file from server1 to server2 at regular intervals, as the txt constantly updates. I cannot use PHP's FTP functions, as Heroku doesn't like that. I have no idea how to do this.
I have come up with a plan to somehow get the txt from server1 to an online "directory" that the app on server2 can access, but I have no idea how to do this, or whether it is even possible. Is there a better way to transfer the file? Should I not be using Heroku for this in the first place?
Heroku is great for running PHP apps, but it has an ephemeral filesystem. It is not built for uploading files and storing them on the "server".
(https://devcenter.heroku.com/articles/dynos#ephemeral-filesystem)
If you deploy a new version of your app to Heroku, the uploaded file will be removed. If you don't mind uploading the file again, simply push it to your PHP app running on Heroku using a POST request.
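As a rough sketch of that push (the URL, field name, and paths below are placeholders): on server1, send the file with a POST request, and have a small receiving script in the Heroku app:

    # on server1, e.g. from a cron job:
    curl -F "file=@/path/to/data.txt" https://your-app.herokuapp.com/receive.php

    <?php
    // receive.php in the Heroku app: store the upload in /tmp.
    // Remember the file only survives until the dyno restarts.
    if (isset($_FILES['file']) && $_FILES['file']['error'] === UPLOAD_ERR_OK) {
        move_uploaded_file($_FILES['file']['tmp_name'], '/tmp/data.txt');
    }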
If you want to store the file more reliably, think about using a storage service like AWS S3.
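For example, with the AWS SDK for PHP (v3) the upload is a few lines. This is a minimal sketch in which the bucket name, region, and key are placeholders, and credentials are assumed to come from environment variables or an IAM role:

    <?php
    require 'vendor/autoload.php';  // aws/aws-sdk-php via Composer

    use Aws\S3\S3Client;

    $s3 = new S3Client([
        'version' => 'latest',
        'region'  => 'us-east-1',       // placeholder region
    ]);

    $s3->putObject([
        'Bucket'     => 'my-app-files', // placeholder bucket
        'Key'        => 'data/latest.txt',
        'SourceFile' => '/tmp/data.txt',
    ]);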

How to run PHP scripts on server without updating server's files

I need to run PHP scripts on a server without having to update the files it stores, i.e. perform testing on the real server before deployment. The server has access to a database which is inaccessible from outside. For this reason I can't run my scripts locally; I need to run them within the server's environment, but I don't want to update the files stored on the server. Is there any way to do so? Is there a tool for remote PHP debugging?
There are several ways to achieve this:
You can export the database from the live server and import it into your local server for testing and debugging purposes.
You can upload the code into a separate folder or subdomain on the server, connect it to the database, and test it with the live server configuration. Once you are satisfied, replace the live files.
I have found a solution: Xdebug for PHP. However, as was mentioned in the comments, testing against a copy of the DB and using virtualization is the more common approach, which I personally will probably stick to.
For those who are still determined to go the "hard" way, here is a link to a HOWTO on Xdebug installation for PHP on Ubuntu: http://ubuntuforums.org/showthread.php?t=525257.
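As a rough illustration, remote debugging with Xdebug 2.x is enabled through php.ini settings along these lines (the extension path and the IDE host/port below are placeholder assumptions, not values from the linked HOWTO):

    ; Example php.ini settings for remote debugging with Xdebug 2.x.
    zend_extension=/usr/lib/php5/20121212/xdebug.so
    xdebug.remote_enable=1
    xdebug.remote_host=203.0.113.10  ; machine where your IDE listens
    xdebug.remote_port=9000          ; Xdebug 2's default debug port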

How to download a file from Heroku with git

How can I download my currently changed file from the Heroku server?
I built a PHP application that is running on Heroku.
If I use $ git clone git@heroku.com:myappli.git -o heroku, then would this upload my original project files from my computer to the Heroku server?
If I use $ heroku git:clone -a myappli, then would this download all of the project files to my computer?
How can I download my file (logfilled.txt) from the Heroku server?
Like most PaaS providers, Heroku does not provide a persistent filesystem:
Ephemeral filesystem
Each dyno gets its own ephemeral filesystem, with a fresh copy of the most recently deployed code. During the dyno’s lifetime its running processes can use the filesystem as a temporary scratchpad, but no files that are written are visible to processes in any other dyno and any files written will be discarded the moment the dyno is stopped or restarted.
This means that every time you deploy, files that you have created or modified will be lost or reverted to the last committed state. It is probably not a great idea to create your own log files on Heroku.
However, Heroku automatically logs anything printed to standard out or standard error:
Writing to your log
Anything written to standard out (stdout) or standard error (stderr) is captured into your logs. This means that you can log from anywhere in your application code with a simple output statement.
Logs may be retrieved using the heroku logs command.
Try using PHP's error_log function to write your logs. If you are using a logging library like Monolog you may have to configure it to output to php://stderr instead of to a file.
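For instance, a minimal sketch (assuming monolog/monolog is installed via Composer; the channel name is arbitrary):

    <?php
    require 'vendor/autoload.php';

    use Monolog\Logger;
    use Monolog\Handler\StreamHandler;

    // Send all log records to stderr so Heroku captures them.
    $log = new Logger('app');
    $log->pushHandler(new StreamHandler('php://stderr', Logger::DEBUG));

    $log->info('This shows up in heroku logs.');

    // Plain error_log() output also ends up in the Heroku log stream.
    error_log('So does this.');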
Finally, you could write to an arbitrary file like logfilled.txt and make that accessible via HTTP, then download it using a regular web browser, wget, curl, or any other tool. Note that you will almost certainly want to build some authentication around this; using Heroku's logging facility is a much better option.

Heroku commands inside the app

I have an app on Heroku, using PHP and PostgreSQL. Now I would like to create backups of my database regularly, put them in a folder on the server, or record their S3 URLs and download them.
I have been doing research on the topic. It seems that the best option is to use the pgbackups add-on, which I already have and can use from the local command line, like: heroku pgbackups:url --app=APP_NAME
I want to automate the process, let's say in a cron job. I see we have workers on Heroku, but I have never used them, and this is still a development environment. A free plan does not have workers. Besides, my app doesn't really require background workers. I don't want to buy worker dynos only for automatic database backups. Which way should I go?
If I can create PHP cron jobs on Heroku, then I need to know: how can I run Heroku commands in PHP? I tried exec and passthru, but neither of them seems to work on the Heroku server. On my localhost, the above command (heroku pgbackups) works pretty well, provided the Heroku Toolbelt is installed on the local server.
For Ruby, they have a toolkit for server-side commands (https://github.com/heroku/). But I had no luck in my search for a PHP equivalent...
The overall purpose is to back up the DB, store it on the server, and download it. (Even though Heroku makes backups itself, we want to have them in our own hands :)
How can I make it happen?
Probably the best thing to do is to have a cron job on your backup server run heroku pgbackups:url to get a URL to the latest pgbackup, and then download it with curl. Something like this:
curl $(heroku pgbackups:url) > latest_backup
For more info about heroku pgbackups:url, see:
https://devcenter.heroku.com/articles/pgbackups#downloading-a-backup
Doing anything with worker dynos wouldn't really make sense, because that wouldn't help you get the backup to your backup server unless you were downloading and re-uploading it. Just running a cron job on your backup server and downloading it once is a lot more straightforward.
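Concretely, a crontab entry on the backup server could look roughly like this (the schedule and paths are placeholders, and the Heroku Toolbelt must be installed and authenticated for the user running the job):

    # run daily at 3:00 AM: fetch the latest backup URL and download it
    0 3 * * * curl -o /backups/latest_backup "$(heroku pgbackups:url --app=APP_NAME)"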

Upload a folder via FTP using PHP

Our website currently backs up every night to a separate server that we have, which is fine, but when we go to download the files the next day it takes a long time (usually around 36,000+ images) and affects the speeds of everyone else using our network. Ideally we would do this in the middle of the night, except there's no-one here to do it.
The server that the backup is on is running cPanel, which appears to make it fairly simple to run a PHP file as a cron job.
I'm assuming the following, feel free to tell me I'm wrong.
1) The server the backup is on runs cPanel. It appears that it shouldn't be too difficult to set up a PHP script to run as a cron job in the middle of the night.
2) We could deploy a PHP script utilizing the FTP functions that connects to another server and starts the backup of these files via this cron job.
3) We are running XAMPP on a Windows platform. It includes FileZilla, so I'm assuming it should be able to accept incoming FTP connections.
4) Overall: we could deploy a script on the backup server that runs every night and sends the files back to my local computer running XAMPP.
So that's what I'm guessing. I'm getting stuck at the first hurdle, though. I've tried to create a script that runs on our local computer and sends a specified folder to the backup server when it executes, but all I seem to be able to find are scripts relating to single files. Although I have some experience with PHP, I haven't touched the FTP functions before, and they are giving me some problems. I've tried the other examples here on Stack Overflow with no success :(
I'm just looking for the simplest form of a script that can upload a folder to a remote IP. Any help would be appreciated.
There is a fair amount of overhead involved in transferring a bunch of small files over FTP. I've seen jobs take 5x as long over a local network. It is far easier to pack the files into something like a zip archive and send them as one large file.
You can use exec() to run zip from the command line (or whatever compression tool you prefer). After that, you can send it over FTP pretty quickly (you said you found methods for transferring a single file). For backup purposes, having the files zipped will probably make things easier to handle, but if you need them unzipped you can set up a job on the other machine to unpack the file.
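Here is a minimal sketch of that approach (host, credentials, and paths are placeholders; it assumes a zip command is available on the sending machine, and on the Windows/XAMPP side you could use PHP's ZipArchive class instead):

    <?php
    // Pack the folder into one archive, then upload it over FTP.
    $archive = '/tmp/backup-' . date('Y-m-d') . '.zip';
    exec('zip -r ' . escapeshellarg($archive) . ' /path/to/backup-folder');

    $conn = ftp_connect('192.0.2.50');      // destination server (placeholder)
    ftp_login($conn, 'ftpuser', 'ftppass'); // placeholder credentials
    ftp_pasv($conn, true);                  // passive mode is often needed behind NAT
    ftp_put($conn, basename($archive), $archive, FTP_BINARY);
    ftp_close($conn);
    unlink($archive);                       // clean up the local archive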
