How does one do team development with node.js when all the developers develop on the same dev machine?
Right now the dev server setup has nginx and apache. Developers SSH into the dev server, and each has their own subdomained sandbox to work in (the database is shared). They hack on their code and check it into the SVN repo. Great, works fine... until we started using node.js.
It seems node is not like apache or nginx, where there's an independent server that serves up the code. In node, the server AND the app code are tied together, so each developer needs to start and stop the server whenever changes are made. This creates a problem: once one instance is started, it blocks the port for the other developers.
I'm also having trouble figuring out how to put the node code into the same SVN repository as the PHP app code.
A friend told me the developers could do "timesharing", where the node code can only be modified by one person during a specific timeframe. I'm not sure that process is scalable.
Another option is to have everyone work locally on their own computer with a VM copy of the dev server, so they can develop independently of it. That requires a lot of infrastructure change, and I'm not ready to do that yet.
Any suggestions on how to do this with the current shared dev environment setup?
Also, the reason we are using node.js is to get good comet support. But if it is becoming a blocker for our current infrastructure, I'm willing to try other technologies and servers that work the way nginx or apache does--independent of the app code and compatible with our current development environment.
PS. I tried the nginx http push module. It's not well maintained and doesn't get many updates, so I'm scared to use it in production.
You could have each developer's instance of Node.js running on a different port; a minimal sketch of the idea is below.
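For illustration, a rough sketch assuming each developer exports a personal PORT environment variable before starting their instance (the variable name and port numbers are assumptions, not part of the original setup):

// app.js -- each developer starts their own copy on a personal port,
// e.g. `PORT=3001 node app.js` for one developer, PORT=3002 for another
var http = require('http');
var port = process.env.PORT || 3000; // fall back to 3000 if unset

http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/plain'});
  res.end('Hello from the dev instance on port ' + port + '\n');
}).listen(port);

console.log('Listening on port ' + port);

The nginx already sitting in front of the sandboxes could then map each developer's subdomain to their port, so the subdomain workflow itself would not have to change.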
Does anyone know a solution for deploying a PHP webapp behind a firewall on mostly Windows servers? We have 100+ customers who host our webapp on-premises, and we would like to set up a deployer as part of our Bitbucket pipeline, so our code gets deployed to all installations.
1 customer = 1 installation, aka one deployment.
Today we use a small PHP script and some version control software to pull code changes once a day. It runs on both Linux and Windows servers.
Hit me with any solutions :)
You can make use of PHP Deployer.
You can set up SSH access on the servers and then configure the deployment script to target each server's IP.
The situation
I have been developing in PHP and using WAMP for the past two years. Then I came across a requirement to implement a chat system, followed by instant notifications. So I looked it up and found this awesome "nodejs" that lets you push to connected users in real time.
This guy, in "nodejs socket.io and php", uploaded a way to integrate nodejs, socket.io and PHP without a node server.
So I downloaded his project (from GitHub) and ran it on my computer, but it gave a "connection refused" error on port 8080. So
I went to the nodejs site and installed nodejs on my system (Windows). It automatically updated my environment variables, and I could just go to my command line and run the example project as
path(...) node nodeServer.js
and then ran the index file of the project from the shared link, and it started working. Everything ran smooth and nice.
MY QUESTION
If I cannot run the node app in the small example project without installing nodejs on my system, how am I supposed to install nodejs on a live (Apache) server and use the command line to start it?
I know this might be too silly, but I am really new to nodejs, so I don't know whether I can run node on a live PHP server. If it is possible, can anyone tell me how? Or is it just an idealized situation that can't be done?
Node.js does not need to be installed with Apache. Node.js itself provides a server that listens on a port. You can use Apache or Nginx in front of it as a proxy, but you can also run your application without either of those servers.
Create a file index.js using the code below and run node index.js
var http = require('http');

// Create an HTTP server that answers every request with plain text
http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/plain'});
  res.end('Hello World\n');
}).listen(1337, '127.0.0.1'); // listen on port 1337, localhost only

console.log('Server running at http://127.0.0.1:1337/');
Open your browser and enter this URL: http://127.0.0.1:1337/ and you will see Hello World there. In this case nodejs is listening on port 1337.
If you are using a cloud server, a VPS, or any kind of solution that gives you full control over what gets installed, you can just install node.js there and run what you need...
https://github.com/joyent/node/wiki/installing-node.js-via-package-manager
Some services will let you pick what gets installed... so you just pick nodejs and run it alongside your apache.
However, if you are using a shared hosting solution, only a limited number of those actually host node (if any), and solving this would be almost impossible for you.
Second edit: Sorry for editing twice, but there is a catch with the "no nodejs server" claim in the mentioned Stack Overflow post - there actually is a server, and the post says you need to npm install certain modules... This is not the right way to do it, but if you still want to try it, you need node installed (and npm along with it), then you need to npm install the mentioned packages, add the simple server file quoted in the post, and run it; then you have all you need for your chat... A rough sketch of such a server file is below.
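For illustration only, a minimal sketch of what such a chat server file might look like, assuming socket.io is among the modules the post asks you to npm install (the event name and port are illustrative, not taken from the post):

// server.js -- tiny chat relay; run with `node server.js`
var http = require('http');
var server = http.createServer();
var io = require('socket.io')(server);

io.on('connection', function (socket) {
  // Relay every chat message to all connected clients
  socket.on('chat message', function (msg) {
    io.emit('chat message', msg);
  });
});

server.listen(8080); // the port the example expects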
If you need some help, ping me, but if this is a time-critical project, rather find some third-party solution first... and then learn about this one.
TL;DR: find a hosting service that'll give you admin access and support firewall requests, or self-host with a free DNS subdomain and have a script update your IP so you don't have to pay for a static one.
My Experiences:
You can actually use node for input/output stream manipulation as well; look at gulp and node for more info. Using bower and bluebird on top of a git project makes setting up apps very quick and easy via node.
As for using socket.io with a node/WAMP setup, I've actually used this in the past. I had WAMP installed on the server initially, but I used Apache directives to reverse-proxy requests on port 8080 from the client scripts through to the node.js app; the sketch below shows the client side of that arrangement.
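As a hedged illustration of the client side under such a proxy, assuming the socket.io client script is served through Apache and forwarded to the node app (the event name is hypothetical):

// client.js -- assumes the page already loaded /socket.io/socket.io.js,
// which Apache reverse-proxies to the node.js app on port 8080
var socket = io(); // same-origin connection; the proxy handles the routing

socket.on('chat message', function (msg) {
  console.log('new message:', msg);
});

socket.emit('chat message', 'hello from the browser');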
I did have to install node separately, though, so you'll need something like RamNode maybe (I think they allow hosted apps like IIS/MVC etc. too).
The easiest hosting setup for development, in my opinion, was self-hosting WAMP/node with a free subdomain from afraid.dns.
Otherwise, RamNode gives you full access to admin features on your VM, I believe, so you may be able to install node there as long as you request firewall permissions when needed for extra ports (socket.io used different ports for requests on the page, so I didn't have to worry about CORS or anything).
Is there any way that I can set up Apache to run locally, specifically NOT connecting to the internet, so that it may serve dynamic content (PHP) over a LAN?
I'm trying to set up a development environment on my Windows XP SP3 box and gain some experience with building PHP-driven web applications. I have residential Rogers service, and it's a violation of the TOS to have a server running over that connection.
Umm yes. Just download apache and fire it up. The only way it's going to get to the outside is if you specifically open up ports in your firewall/router to let http traffic in and route it to your machine. And if you're serious about getting some experience, ditch XP and get a quality linux distro on your "development" box. You can always remote desktop or ssh to it from a windows machine if you feel more comfortable that way.
My experience has been that many hosting companies use CentOS for their client servers, so I'd recommend trying that first if the purpose is gaining useful experience.
Oh by the way, Linux happens to be free.
You can use WAMP. It installs everything you'll need to get a testing server up and running in minutes.
Using a packaged solution like WAMP or XAMPP will provide you with the basics for setting up an Apache web server + PHP + MySQL + the phpMyAdmin interface for working with MySQL outside of the command line.
I've been working on a cloud-based (AWS EC2) PHP web application, and I'm struggling with one issue when it comes to working with multiple servers (all under an AWS Elastic Load Balancer). On one server, when I upload the latest files, they're instantly in production across the entire application. But this isn't true with multiple servers - you have to upload files to each of them every time you commit a change. That could work all right if you don't update anything very often, or if you just have one or two servers. But what if you update the system multiple times in one week, across ten servers?
What I'm looking for is a way to 'commit' changes from our dev or testing server and have them 'pushed' out to all of our production servers immediately. Ideally the update would be applied to only one server at a time (even though it just takes a second or two per server), so the ELB will not send traffic to it while files are changing and no production traffic flowing through the ELB is disrupted.
What is the best way of doing this? One of my thoughts was to use SVN on the dev server, but that doesn't really 'push' to the servers. I'm looking for a process that takes just a few seconds to commit an update and then begins applying it to the servers. Also, for those of you familiar with AWS, what's the best way to update an AMI with the latest code so the auto-scaler always launches new instances with the latest version of the software?
There have to be good ways of doing this... I can't really picture sites like Facebook, Google, Apple, Amazon, Twitter, etc. going through and updating hundreds or thousands of servers manually, one by one, when they make a change.
Thanks in advance for your help. I'm hoping we can find some solution to this problem... what has to be at least 100 Google searches by me and my business partner over the last day have proven mostly unsuccessful.
Alex
We use scalr.net to manage our web servers and load balancer instances. It has worked pretty well so far. We have a server farm for each of our environments (two production farms, staging, sandbox). We have preconfigured roles for the web servers, so it's super easy to open new instances and scale when needed. Each web server pulls code from GitHub when it boots up.
We haven't completed all the deployment changes we want to do, but basically here's how we deploy new versions into our production environment:
We use Phing to update the source code and run the deployment on each web server. We created a task that executes a git pull and runs the database changes (the dbdeploy Phing task). http://www.phing.info/trac/
We wrote a shell script that executes Phing, and we added it to Scalr as a script. Scalr has a nice interface for managing scripts.
#!/bin/sh
# Run the production deploy target from the Phing build file
cd /var/www
phing -f /var/www/build.xml -Denvironment=production deploy
Scalr has an option to execute scripts on all the instances in a specific farm, so for each release we just push to the master branch on GitHub and execute the Scalr script.
We want to create a GitHub hook that deploys automatically when we push to the master branch. Scalr has an API that can execute scripts, so it's possible; a rough sketch of such a hook listener follows.
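For illustration, a minimal sketch of a GitHub push-webhook listener in node (the endpoint path, port, and deploy-script path are assumptions, and a real hook should also verify GitHub's webhook signature):

// hook.js -- listens for GitHub push webhooks and triggers the deploy script
var http = require('http');
var exec = require('child_process').exec;

http.createServer(function (req, res) {
  if (req.method === 'POST' && req.url === '/github-hook') {
    var body = '';
    req.on('data', function (chunk) { body += chunk; });
    req.on('end', function () {
      var payload = JSON.parse(body);
      // Only deploy on pushes to the master branch
      if (payload.ref === 'refs/heads/master') {
        exec('/var/www/deploy.sh', function (err, stdout, stderr) {
          console.log(err ? stderr : stdout);
        });
      }
      res.end('ok');
    });
  } else {
    res.end();
  }
}).listen(8090);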
Have a good look at KwateeSDCM. It enables you to deploy files and software to any number of servers and, if needed, to customize server-specific parameters along the way. There's a post about deploying a web application to multiple Tomcat instances, but the tool is language-agnostic and will work for PHP just as well, as long as you have SSH enabled on your AWS servers.
I have two computers that I use for development, one at home and one at the office. I use Aptana Studio 3 on both machines and would like to be able to easily work on a single project from both computers.
What are some easy ways to transport the project between computers? Right now I am just using a USB drive to transfer the files.
Also, I'm using a local apache server on one computer, and a local IIS server on the other.
I think you should use a version control system such as SVN, Git, or Mercurial. To manage your project, I suggest this software:
Tortoise
You might try connecting remotely (via Remote Desktop Connection, for example) from home to the office computer; that way you only ever work on the office machine and there is no need for file transfer.
Alternatively, you can set up a source control server (using SVN, for example) and commit your projects to it. This way, you will be able to work on them from multiple locations.
If you have the opportunity to use a (virtual) server and remote-desktop into it from both computers, you won't go back to another solution. With a server-side language like PHP it's ideal, as you have your repository directly on your test web server (LAMP/WAMP). That ensures ONE version for all your tools, plus easier and faster backups, syncing, etc.
If I had no access to a remote virtual server, I would use GitHub as an alternative, for the sake of code-base safety and decent sync times. But I'm no expert on GitHub.
Staging Environment
The best situation in your case will be to have a separate staging machine which is accessible to all of the development machines. This can be another machine you control, one of the development boxes made available to all of the others, or an external host - you can get a dedicated virtual machine from cloud providers for as little as $10/month, or, if your project is simple enough, use a plain web host even more cost-effectively.
Source Code Manager
A source code manager is a good start - SVN is common and free. Git is another and can even be set up to do remote deployments (see the sketch after this list). These tools will give you two benefits:
Shared code between all environments, always up to date
Protection against data loss and error - if something major breaks, revert to a working copy
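As a hedged illustration of the remote-deployment idea, a git post-receive hook on the server can check pushed code out into the web root; this sketch writes the hook as a node script, and every path in it is a hypothetical example:

#!/usr/bin/env node
// post-receive -- server-side git hook: after a push, check the newly
// pushed master branch out into the web root (paths are hypothetical)
var execSync = require('child_process').execSync;

execSync(
  'git --work-tree=/var/www/site --git-dir=/home/user/site.git checkout -f master'
);
console.log('Deployed latest master to /var/www/site');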
File Synchronization
Finally, a tool like Dropbox can synchronize your files across all of your systems as you make changes. Again, this one is free and can be installed on most operating systems.