Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 8 years ago.
When I write a web app in PHP, it can run in different modes: as FastCGI (php-fpm), as an Apache module (mod_php), and so on. In all these cases, when I edit my PHP scripts the application updates immediately, without my needing to restart a server. I know that a Ruby web app can also run in different modes (e.g. FastCGI, or Unicorn behind Nginx). I would like an overview of the most common ways to launch a Ruby web app, the technical details of each (e.g. when I have to restart my server to pick up script changes), and their pros and cons.
There are many ways to run Ruby applications: classic 1990s-style CGI, FastCGI, standalone HTTP-capable processes such as Mongrel or Thin, and the more modern, recommended approach, which uses an application server like Passenger to manage the processes more directly.
Ruby on Rails can have several operating modes. The default two are development and production. These have some important differences:
In development:
Anything in app/ or config/routes.rb is reloaded on each request.
log/development.log is as verbose as possible, recording every query executed.
Assets are served in their raw form and can be changed at any time.
In production:
The application is loaded once and cached; any change requires a restart.
The log/production.log file contains only errors and important notifications.
Assets are minified, compressed, and served in bundles; even minor changes require repackaging everything.
It's generally a bad idea to be editing scripts on a production server, so the Rails approach is usually a good thing. If you need to rapidly iterate on something, do it on your local copy and push the changes to the server only when you're confident they'll work.
If you really need to emergency fix something on the server, you can edit the code, then touch tmp/restart.txt to force a reload. When you change assets you'll have to run rake assets:precompile to repackage them.
Like a lot of things, Rails makes the recommended approach easy and the risky one hard.
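On the command line, the emergency-fix workflow above looks roughly like this. The application path is an assumption; adjust it to wherever your app is deployed:

```shell
# Hypothetical app root on the server -- adjust to your deployment.
APP_ROOT="${APP_ROOT:-.}"
cd "$APP_ROOT"

# Tell Passenger to reload the application on the next request.
mkdir -p tmp
touch tmp/restart.txt

# Only needed when assets (JS/CSS) changed:
# RAILS_ENV=production bundle exec rake assets:precompile
```

The restart file is only checked by Passenger-style launchers; standalone servers like Thin or Unicorn need an explicit process restart instead.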
I'm fairly new to web development and I have only published one website before. For that site I used only some PHP, without a framework. Now I'm planning to use the Laravel framework for my next, bigger website. I'm wondering whether publishing a website differs when using a PHP framework. If so, what are the major differences, and where can I read about them (googling has not helped me)?
You have countless options. Here are some I am aware of:
1. FTP'ing
Basically, you could just publish your site/app by FTP'ing it up to your server. The biggest issue will be database changes: the main question is whether you are able to run migrations or not. For that you would at least need SSH access to your server, with the required prerequisites installed. Otherwise, you would need to keep track of the changes in some other way and change the DB manually (which is not a good option, IMO).
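One hedged way to "keep track of the changes in some other way" is numbered SQL files applied exactly once, in order. This is only a sketch: `run_sql` is a stand-in for whatever actually reaches your database, and the file layout is an assumption:

```shell
# Apply each migrations/NNN_*.sql exactly once, in filename order, recording
# applied names in a .applied marker file so re-runs skip them.
apply_migrations() {
  dir="$1"
  applied="$dir/.applied"
  touch "$applied"
  for f in "$dir"/[0-9]*.sql; do
    [ -e "$f" ] || continue                   # no migrations at all
    name=$(basename "$f")
    grep -qxF "$name" "$applied" && continue  # already applied, skip
    run_sql "$f" && echo "$name" >> "$applied"
  done
}
```

You would supply `run_sql` yourself, e.g. `run_sql() { mysql -h db.example.com -u deploy -p"$PW" myapp < "$1"; }` (host and credentials here are illustrative).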
2. Automated deployments
There are server-deployment automation tools; the one I know is Capistrano. You write scripts that perform the deployment. These tools can also run migrations if you tell them to, but you need SSH access for that. Google will tell you the rest; here is a good tutorial.
2.1. Push-to-deploy
If you use SCM for your "bigger project" (which I would highly recommend), you could use push-to-deploy techniques. This approach basically uses Git hooks to trigger deployment scripts. Deployer could take care of that for you if you do not want to build your push-to-deploy solution from scratch. Other alternatives are Rocketeer (open source) and DeployHQ (paid).
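The Git-hook mechanism itself is small. A sketch of what `hooks/post-receive` in the bare repository on the server could look like; the paths and the optional migration command are assumptions:

```shell
#!/bin/sh
# Check the pushed tree out into the directory the web server serves.
deploy() {
  target="$1"   # working copy the web server serves, e.g. /var/www/myapp
  repo="$2"     # bare repository the push landed in, e.g. /srv/repos/myapp.git
  mkdir -p "$target"
  git --work-tree="$target" --git-dir="$repo" checkout -f
  # With SSH access you could also run migrations afterwards, e.g.:
  # (cd "$target" && php artisan migrate --force)
}

# In the real hook you would call it with your server's paths:
# deploy /var/www/myapp /srv/repos/myapp.git
```

The hook runs on the server on every `git push`, which is all "push-to-deploy" means at bottom; tools like Deployer or Rocketeer wrap this with rollbacks and shared directories.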
2.2. Using Laravel-tailored hosting/deployment/server-management services
Services like Laravel Forge or Envoyer offer zero-downtime automated deployments. I guess they are based on push-to-deploy as well. If you want to learn more about them, I recommend watching the corresponding Laracasts series (Forge, Envoyer). Be aware that they are not free.
I hope this overview gets you started. Happy deploying!
There is no difference between publishing a plain PHP website and one built with a framework. Simply FTP up all your files and it will work; just remember to upload the framework files as well.
I need to get the status of services across a large number of servers in order to calculate uptime percentages. I may need to use multiple servers to do the checking. Does anyone know of a reliable way to queue them to be checked at a specific time/interval?
I'm writing the application in PHP, but I'm open to using other languages and tools for this. My only requirement is that it must run on Linux.
I've looked into things like Gearman for job queuing, but I haven't found anything that would work well.
In order to get uptime percentages for your services, you can execute commands that check service status and log the results for later analysis and calculation. Here are some ways of doing that:
System commands like top, free -m, vmstat, iostat, iotop, sar, netstat, etc. Nothing comes near these Linux utilities when you are analysing or debugging a problem. These commands give you a clear picture of what is going on inside your server.
SeaLion: The agent executes all the commands mentioned in #1, as well as custom commands. The output of these commands can be accessed in a clean web interface. This tool comes in handy when you are working across hundreds of servers, as installation is simple. And it's free.
Nagios: The mother of all monitoring/alerting tools. It is highly customizable but difficult for beginners to set up, although there are Nagios plugins that help.
Munin
Server Density: A cloud-based paid service that collects important Linux metrics and lets users write their own plugins.
New Relic: Another well known hosted monitoring service.
Zabbix
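None of the tools above is strictly required for the basic mechanic the question asks about: a periodic check that appends an up/down record, from which an uptime percentage is later computed. A minimal sketch, where the log format, the `nc` probe, and the cron schedule are all assumptions:

```shell
# Append one timestamped up/down record per check.  Run from cron, e.g.:
#   * * * * * /usr/local/bin/check_service web1.example.com 80 /var/log/uptime.log
check_service() {
  host="$1"; port="$2"; logfile="$3"
  if nc -z -w 3 "$host" "$port" 2>/dev/null; then status=up; else status=down; fi
  echo "$(date -u +%Y-%m-%dT%H:%M:%SZ) $host:$port $status" >> "$logfile"
}

# Uptime percentage for one host = up records / total records * 100.
uptime_pct() {
  awk -v h="$1" '$2 ~ h { t++; if ($3 == "up") u++ }
                 END { if (t) printf "%.1f\n", 100 * u / t }' "$2"
}
```

Running the checker from several machines (as the question suggests) just means merging or summing the per-machine logs before computing the percentage.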
Closed. This question is opinion-based. It is not currently accepting answers.
Closed 9 years ago.
I have a PHP application [mostly REST] which runs on top of Apache in a Linux virtual machine. This application does a lot of data queries, and I have started having performance issues.
To me, one way to address this is to use Node.js async patterns. I also plan to implement WebSockets. But the problem is that the PHP codebase is very large; it would take months to rewrite in Node.
Is there a middle ground short of a complete rewrite, where I handle interaction with the browser in Node and interaction with the database in the PHP CLI, and Node calls the PHP CLI while approximating the Apache environment?
I am using the Slim PHP framework for the REST API, with both HTTP Basic Auth and PHP sessions, and $_GET variables for extra filters on GET requests. I don't know much about the internal workings of Slim, but I think it depends on the Apache/PHP implementation of HTTP requests and responses.
How do I send the message body [POST, PUT], which in 99% of cases is JSON, to the PHP CLI? (I have file uploads too, but those can be ignored for now.) I could have the PHP CLI put the JSON output on STDOUT and parse it from there.
The real problem is how to remove the dependency on the PHP Apache SAPI without changing much of the codebase, and how to integrate it with Node. Are there any tools or libraries which can help in this case?
One more side question: can Nginx help me here somehow?
Note: my knowledge of Node is limited to a few fun scripts and custom linting, template-compiling, and testing scripts for browser-side code.
First, you could put Nginx in front of Apache. This will allow you to transition your actions to Node gradually by routing selectively to one or the other.
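In nginx terms, that selective routing might look like the following sketch; the ports and the /realtime prefix are assumptions, purely to illustrate the split:

```nginx
server {
    listen 80;

    # New endpoints handled by Node:
    location /realtime/ {
        proxy_pass http://127.0.0.1:3000;
    }

    # Everything else still goes to the existing Apache + PHP app:
    location / {
        proxy_pass http://127.0.0.1:8080;
    }
}
```

As actions get ported, you add `location` blocks pointing at Node, leaving the PHP code untouched until its route moves.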
Alternatively, you could put Node in front and use node-http-proxy with Express (for example) to proxy selectively to Apache. I haven't tried it myself, but I guess it should work.
You could also use dnode to call PHP functions from Node. ZeroMQ is an option, too.
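Whichever bridge you pick, the CLI contract the question describes (JSON request on STDIN, JSON response on STDOUT) is easy to prototype. The `handler.php` below is a stand-in written inline for the demo, not part of any real codebase:

```shell
# A stand-in for one CLI-invoked action: read JSON from STDIN, answer on
# STDOUT -- the contract Node's child_process would speak to it.
cat > /tmp/handler.php <<'PHP'
<?php
$req = json_decode(file_get_contents('php://stdin'), true);
echo json_encode(array('echoed' => $req['msg']));
PHP

# Node's child_process.spawn('php', ['/tmp/handler.php']) would do the
# equivalent of:
command -v php >/dev/null && echo '{"msg":"hello"}' | php /tmp/handler.php || true
```

What this cannot give you for free is the SAPI environment Slim expects ($_GET, headers, sessions), which would have to be passed in as part of the JSON request and reconstructed on the PHP side.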
I have a client who wants a fairly simple CMS-driven website that sorts and displays daily reports. The website will require subscriptions and include membership, free trials, etc.
Originally I was going to write the site in PHP, as none of the requirements are too heavy and I am very experienced in it. However, after speaking with the client, he has worked closely with someone who has a C++ product that offers a workflow that includes the entire process of handling subscriptions, logins, and trials and (apparently) can be used on a web platform.
This throws a wrench into my original plan, because even though I know C++ I have never had to deploy it on a webserver or have it communicate with PHP. I've already written a good deal of the site in PHP, so would prefer not having to re-write.
Can I have the two communicate on the same server? What would be required to do so? Would it be worth my time or should I just decide to scrap PHP and use C++? Or should I tell my client he's nuts?
That's about all the info I have about the project right now. Not sure if I can provide much more info, but will try if it's needed.
Thanks for all answers.
Tell him he is nuts.
The reason is that none of those tasks requires the benefits C++ offers over PHP. It would be a heavy maintenance pain, and in the big picture, gluing the two together is more work (in hours) than writing those features in PHP.
The only thing that would justify C++ is some heavy mathematical business logic. And I mean heavy.
As for the problems: just think about debugging.
In addition to what Thomas says (which is all true), your hosting company will most likely prohibit running custom binaries. Hosting packages short of virtual private server normally don't allow user-written compiled code on the Web server, only scripts.
VPS hosting is, on average, 5 times as expensive.
You could rewrite the C++ code in PHP. You could also convert the C++ to Java using a converter and run it on the JVM, or use the C++ code directly, if your host allows either. You could even host the C++ code on a separate machine, if that makes sense in your case.
I would tell the client that unless there is an explicit need for the C++ language, I would go with PHP. The communication between C++ and PHP adds to server load even if the host allows you to use the C++ module, and in the future you will have a lot of pain maintaining it.
I want to start automating more of my web development process, so I'm looking for a build system. I write mostly PHP apps on Mac OS X and deploy to Linux servers over FTP. Many of my clients have basic hosting providers, so shell access to their servers is typically not available; however, remote MySQL access usually is. Here is what I want to do with a build system:
When Building:
Lint JavaScript Files
Validate CSS Files
Validate HTML Files
Minify and concatenate JS and CSS files
Verify PHP Syntax
Set Debug/Production flags
When Deploying:
Checkout latest version from SVN
Run build process
Upload files to server via FTP
Run SQL scripts on remote DB
I realize this is a lot of work to automate but I think it would be worth it. So what is the best way to start down this path? Is there a system that can handle builds and deploys, or should I search for separate solutions? What systems would you recommend?
All you ask for can be done with Phing.
Phing is a deployment framework written in PHP and modeled after Apache Ant. It comes with a large set of ready-to-use deployment tasks, including database deployment, remote file transfers and VCS connectivity. If you are missing functionality, you can extend Phing with standard PHP.
Phing provides the following features:
Simple XML buildfiles
Rich set of provided tasks
Easily extendable via PHP classes
Platform-independent: works on UNIX, Windows, MacOSX
No required external dependencies
Built & optimized for ZendEngine2/PHP5
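A buildfile covering part of the wish list might look like this sketch. The `phplint` and `ftpdeploy` task names are Phing's; the targets, filesets, and FTP details are assumptions to adapt (run with `phing deploy`):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<project name="mysite" default="build">

  <!-- Verify PHP syntax -->
  <target name="lint">
    <phplint haltonfailure="true">
      <fileset dir="src">
        <include name="**/*.php"/>
      </fileset>
    </phplint>
  </target>

  <!-- JS/CSS minification, debug/production flags, etc. hang off this -->
  <target name="build" depends="lint"/>

  <!-- Upload the built tree over FTP (no shell access needed) -->
  <target name="deploy" depends="build">
    <ftpdeploy host="ftp.example.com" port="21"
               username="${ftp.user}" password="${ftp.pass}"
               dir="/public_html">
      <fileset dir="build"/>
    </ftpdeploy>
  </target>
</project>
```

Since your hosts allow remote MySQL, the "run SQL scripts on the remote DB" step can also stay in the buildfile via Phing's database tasks, keyed off the same deploy target.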
You might also want to have a look at Hudson, an extensible continuous integration server.
The supported features are available here.
Among other languages, it supports PHP. This article presents some nice plugins for PHP and also suggests Phing for the build. See Gordon's answer for details.