Does anyone know a solution for deploying a PHP webapp behind a firewall on mainly Windows servers? We have 100+ customers who host our webapp on-premises, and we would like to set up a deployer, as part of our Bitbucket pipeline, so our code gets deployed to all installations.
1 customer = 1 installation aka deployment
Today we use a small PHP script and some version-control software to pull code changes once a day. It runs on both Linux and Windows servers.
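Roughly, the script does nothing more than this sketch (the paths are placeholders):

<?php
// pull.php - invoked once a day by cron / Task Scheduler on each installation.
// The path is a placeholder, e.g. C:\inetpub\webapp on the Windows servers.
chdir('/var/www/webapp');
// Pull the latest code from the central repository and log the output.
exec('git pull origin master 2>&1', $output, $exitCode);
file_put_contents('update.log', implode(PHP_EOL, $output) . PHP_EOL, FILE_APPEND);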
Hit me with any solutions :)
You can make use of PHPDeployer.
You can set up SSH access on the servers and then configure the deployment script with the IP of each server it should deploy to.
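For example, a minimal deploy.php along these lines would deploy to many installations in one run. This is just a sketch assuming Deployer's common recipe (v6-style API); the repository URL, host names, and paths are placeholders:

<?php
// deploy.php - sketch only; repository URL, hosts, and paths are placeholders.
namespace Deployer;

require 'recipe/common.php';

set('repository', 'git@bitbucket.org:acme/webapp.git');

// One host entry per customer installation; the list could be generated
// from wherever you keep the 100+ installation records.
host('customer1.example.com')->user('deploy')->set('deploy_path', '/var/www/webapp');
host('customer2.example.com')->user('deploy')->set('deploy_path', '/var/www/webapp');

desc('Deploy the webapp to all configured installations');
task('deploy', [
    'deploy:prepare',
    'deploy:update_code',
    'deploy:symlink',
    'cleanup',
]);

A Bitbucket Pipelines step can then run php vendor/bin/dep deploy to roll the release out over SSH to every configured host.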
Related
After many hours of reading documentation and messing around with Amazon Web Services, I am still unable to figure out how to host a PHP page.
Currently I am using the S3 service for a basic website, but I know that this service does not support dynamic pages. I was able to use Elastic Beanstalk to get the sample PHP application running, but I have really no idea how to use it. I read up on some other services, but they don't seem to do what I want or they are just way too confusing.
So what I want is to host a website with Amazon that has dynamic PHP pages. Is this possible, and which services do you use?
For a PHP app, you really have two choices in AWS.
Elastic Beanstalk is a service that takes your code and manages the runtime environment for you - once you've set it up, it's very easy to deploy, and you don't have to worry about managing servers - AWS does pretty much everything for you. You have less control over the environment, but if your application will run in EB then this is a pretty easy path.
EC2 is closer to conventional hosting. You need to decide how your servers are configured & deployed (what packages get installed, what version of Linux, instance size, etc), your system architecture (do you have separate instances for cache or database, whether or not you need a load balancer, etc) and how you manage availability and scalability (multiple zones, multiple data centers, auto-scaling rules, etc).
Now, those are all things that you can use - you don't have to. If you're just trying to learn about PHP on AWS, you can start with a single EC2 instance, deploy your code, and get it running in a few minutes without worrying about any of the stuff in the previous paragraph. Just create an instance from the Amazon Linux AMI, install Apache & PHP, open the appropriate ports in the firewall (AKA the EC2 security group), deploy your code, and you should be up & running.
Your PHP must run on EC2 machines.
Amazon provides great tools to make life easy (Beanstalk, ECS for Docker, ...) but in the end, you own EC2 machines.
There is no such thing as a place where you can put your PHP code without worrying about anything else ;-(
If you are having problems hosting PHP sites on AWS, then you can go with a service provider like Cloudways. They provide managed AWS servers with one-click installs of PHP frameworks and CMSes.
I need to set up cron/automatic tasks on Windows (shared), Linux (shared), and WAMP servers.
The problem is that the project is running on multiple environments.
So what is the best way to set up cron/scheduled tasks?
What I actually need to do is check email servers for new emails and, if something is found, save it to the local DB. If you have any alternative other than a cron job, then please let me know.
Thanks.
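For reference, the job itself is simple; something along the lines of this sketch (PHP IMAP extension assumed; host, credentials, and table are placeholders) is all that needs to run periodically:

<?php
// fetch_mail.php - periodic job sketch; host, credentials, and table are placeholders.
$mailbox = imap_open('{mail.example.com:993/imap/ssl}INBOX', 'user@example.com', 'secret');
if ($mailbox === false) {
    die('Cannot connect: ' . imap_last_error());
}

$pdo  = new PDO('mysql:host=localhost;dbname=app', 'dbuser', 'dbpass');
$stmt = $pdo->prepare('INSERT INTO emails (subject, sender) VALUES (?, ?)');

// Only process messages that have not been seen yet.
$unseen = imap_search($mailbox, 'UNSEEN') ?: [];
foreach ($unseen as $msgno) {
    $header = imap_headerinfo($mailbox, $msgno);
    $stmt->execute([$header->subject ?? '', $header->fromaddress ?? '']);
}

imap_close($mailbox);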
EDIT:
As I mentioned in the question, I have multiple emails/filtering, so I need to run something in the background to fetch data periodically: cron on Linux and Scheduled Tasks on Windows.
But the real problem is that I am doing it on shared hosting (it depends on the client), so I cannot use cron/Scheduled Tasks.
Example: the project is installed on GoDaddy Windows shared hosting; it is a Windows server, so there is (normally) no support for cron, and they won't allow the use of Scheduled Tasks.
The question is: is there any alternative to cron/Scheduled Tasks?
Short Answer
I don't know of alternatives to cron/Scheduled Tasks that would fulfill your need.
I suggest you outsource the scheduling to another server; see the possibilities below.
I came up with the following possibilities:
Shared hosting with CRON jobs (easiest)
Look for a shared hosting provider who lets you add cron jobs (e.g. through the webspace management panel). HostEurope (German) would be such a host.
You own a dedicated (virtual) server
Given you deploy this project to multiple shared servers but own a dedicated (virtual) server for yourself:
Make your script publicly available but guard it with a strong authorization mechanism (hard-to-guess request token, whitelisting certain IPs as callers, ...); see the sketch after this list.
Set up a cron job on your own server which calls the script on the client's webhost.
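For illustration, the guard at the top of the publicly reachable job script could look like this sketch (the token and IP are placeholders):

<?php
// cron_job.php - guard for a publicly reachable job script.
// The IP and token below are placeholders; generate a long random token.
$allowedIps  = ['203.0.113.10']; // the server that will call this script
$secretToken = 'replace-with-a-long-random-string';

if (!in_array($_SERVER['REMOTE_ADDR'], $allowedIps, true)
    || !hash_equals($secretToken, (string) ($_GET['token'] ?? ''))) {
    http_response_code(403);
    exit('Forbidden');
}

// ... the actual periodic work goes here ...

The cron job on your own server then simply requests the URL (e.g. with wget or curl) and passes the token as a query parameter.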
You don't own a server
Same as the last possibility, but when you don't own a dedicated server:
Set up a virtual machine at some cloud provider (e.g. OpenShift) and add the cron job there. Don't use this instance for other jobs, and it should fulfill your needs perfectly (reference: https://openshift.redhat.com/community/blogs/getting-started-with-cron-jobs-on-openshift)
Requirements don't meet infrastructure (likely!)
Your client/project has requirements which don't fit a shared hosting environment. You are strongly encouraged to get a hosting plan fulfilling your true needs. Price differences between shared hosting and entry-level virtual servers or dedicated hosting aren't so steep that investigating them is out of the question.
As everyone else suggested, if this is a web page you can use wget to run it.
If it is a CLI script, you will have to run it with php /directory/filepath.php.
If the actual question is HOW you are going to run it periodically, you will have to use cron on *NIX and Scheduled Tasks on Windows Server.
If you want to install the cron automatically, you will have to check the OS and act depending on whether it is Windows or *NIX.
A Google search will give you results on how to do it in both environments.
Edit after reds clarification
As Samuel Herzog pretty nicely says, on shared hosting you (usually) have a control panel.
The most well-known ones for Linux are:
Cpanel: http://www.siteground.com/tutorials/cpanel/cron_jobs.htm
Plesk: http://www.hosting.com/support/plesk/crontab
Webmin: Setting up a cron job with Webmin
And for Windows I only have some familiarity with Plesk, for which the procedure is the same as above.
If you don't have a control panel but have shell access (Linux), you could follow this tutorial.
If you don't have a control panel but have remote desktop (Windows), you could follow this tutorial.
If you don't have any of the above, you should follow Samuel Herzog's suggestion about a VM at a cloud provider, or consider upgrading to a VPS or a dedicated server.
Is there any way that I can set up Apache to run locally, specifically NOT connecting to the internet, so that it may serve dynamic content (PHP) over a LAN?
I'm trying to set up a development environment on my Windows XP SP3 box and gain some experience with building web PHP driven web applications. I have residential Rogers service, and it's a violation of the TOS to have a server running over that connection.
Umm, yes. Just download Apache and fire it up. The only way it's going to get to the outside is if you specifically open up ports in your firewall/router to let HTTP traffic in and route it to your machine. And if you're serious about getting some experience, ditch XP and get a quality Linux distro on your "development" box. You can always remote desktop or SSH to it from a Windows machine if you feel more comfortable that way.
My experience has been that many hosting companies use CentOS for their client servers, so I'd recommend trying that first if the purpose is gaining useful experience.
Oh by the way, Linux happens to be free.
You can use WAMP. It installs everything you'll need to get a testing server up and running in minutes.
Using a packaged solution like WAMP or XAMPP will provide you with the basics for setting up an Apache web server + PHP + MySQL + the phpMyAdmin interface for working with MySQL outside of the command line.
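With either package, a quick sanity check is a one-liner dropped into the web root (the default roots below are assumptions for a stock install):

<?php
// info.php - put this in the web root, e.g. C:\wamp\www (WAMP) or
// C:\xampp\htdocs (XAMPP), then browse to http://localhost/info.php.
// A rendered configuration page confirms Apache and PHP are wired up.
phpinfo();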
I’ve been working on a cloud-based (AWS EC2) PHP web application, and I’m struggling with one issue when it comes to working with multiple servers (all under an AWS Elastic Load Balancer). On one server, when I upload the latest files, they’re instantly in production across the entire application. But this isn’t true when using multiple servers – you have to upload files to each of them, every time you commit a change. This could work alright if you don’t update anything very often, or if you just have one or two servers. But what if you update the system multiple times in one week, across ten servers?
What I’m looking for is a way to ‘commit’ changes from our dev or testing server and have them ‘pushed’ out to all of our production servers immediately. Ideally the update would be applied to only one server at a time (even though it just takes a second or two per server), with the ELB not sending traffic to that server while its files are changing, so as not to disrupt any production traffic.
What is the best way of doing this? One of my thoughts would be to use SVN on the dev server, but that doesn’t really ‘push’ to the servers. I’m looking for a process that takes just a few seconds to commit an update and subsequently begin applying it to the servers. Also, for those of you familiar with AWS, what’s the best way to update an AMI with the latest updates so the auto-scaler always launches new instances with the latest version of the software?
There have to be good ways of doing this... I can’t really picture sites like Facebook, Google, Apple, Amazon, Twitter, etc. going through and updating hundreds or thousands of servers manually, one by one, when they make a change.
Thanks in advance for your help. I’m hoping we can find some solution to this problem; what must be at least 100 Google searches by both myself and my business partner over the last day have proven mostly unsuccessful.
Alex
We use scalr.net to manage our web servers and load-balancer instances. It has worked pretty well so far. We have a server farm for each of our environments (2 production farms, staging, sandbox). We have pre-configured roles for web servers, so it's super easy to open new instances and scale when needed. The web servers pull code from GitHub when they boot up.
We haven't completed all the deployment changes we want to do, but basically here's how we deploy new versions into our production environment:
We use Phing to update the source code and run the deployment on each web server. We created a task that executes a git pull and runs database changes (the dbdeploy Phing task). http://www.phing.info/trac/
We wrote a shell script that executes Phing, and we added it to Scalr as a script. Scalr has a nice interface to manage scripts.
#!/bin/sh
# Run the Phing 'deploy' target against the production environment.
cd /var/www
phing -f /var/www/build.xml -Denvironment=production deploy
Scalr has an option to execute scripts on all the instances in a specific farm, so for each release we just push to the master branch on GitHub and execute the Scalr script.
We want to create a GitHub hook that deploys automatically when we push to the master branch. Scalr has an API that can execute scripts, so it's possible.
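As a sketch, the hook endpoint could look like this (the webhook secret and the deploy command are placeholders; it reuses the shell script above rather than the Scalr API):

<?php
// hook.php - GitHub push-to-deploy endpoint sketch; secret and command are placeholders.
$secret  = 'replace-with-webhook-secret';
$payload = file_get_contents('php://input');

// GitHub signs the payload with an HMAC-SHA1 in the X-Hub-Signature header.
$expected = 'sha1=' . hash_hmac('sha1', $payload, $secret);
if (!hash_equals($expected, $_SERVER['HTTP_X_HUB_SIGNATURE'] ?? '')) {
    http_response_code(403);
    exit('Bad signature');
}

$event = json_decode($payload, true);
// Deploy only on pushes to the master branch.
if (($event['ref'] ?? '') === 'refs/heads/master') {
    exec('/bin/sh /var/www/deploy.sh > /dev/null 2>&1 &');
}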
Have a good look at KwateeSDCM. It enables you to deploy files and software on any number of servers and, if needed, to customize server-specific parameters along the way. There's a post about deploying a web application on multiple Tomcat instances, but it's language-agnostic and will work just as well for PHP, as long as you have SSH enabled on your AWS servers.
How does one do team development with node.js when all the developers develop on the same dev machine?
Right now the dev server setup has nginx and Apache. Developers SSH into the dev server, and they have their own subdomained sandboxes to work in (the database is shared). They hack on their code and check it into the SVN repo. Great, works fine... until we started using node.js.
It seems node is not like Apache or nginx, where there's an independent server that serves up code. In node, the server AND the app code are tied together, so each developer needs to start and stop the server when changes are made. This creates a problem: if one instance is started, it blocks the port for the other developers.
I'm also having trouble figuring out how to put the node code into the same SVN repository as the PHP app code.
A friend told me the developers can do "timesharing" where the node code can only be modified by someone in a specific timeframe. Not sure if this process is scalable.
Another option is to have everyone work locally off their computer with a VM copy of the dev server so they can develop independent of the dev server. This requires a lot of infrastructure change and I'm not ready to do that yet.
Any suggestions on how to do this with the current shared dev environment setup?
Also, the reason why we are using node.js is to have good comet support. But if this is becoming a blockage to our current infrastructure, I'm willing to try other technologies and servers that work the way nginx or Apache does--independent of the app code--and can be compatible with our current development environment.
PS. I tried the nginx HTTP push module. It's not well maintained and doesn't get many updates. I'm scared to use it in production.
You could have each developer's instance of Node.js running on a different port, e.g. by reading the port from an environment variable or a command-line argument, so the instances don't collide.