I am working on a site that is hosted on GoDaddy, managed through cPanel.
The client wants to transition from their old PHP server to a node.js system.
They would like to implement new code in phases while leaving the old site up and running. The old and new code would be running on the same server.
I have a good break point for phase 1, but I am not sure how to allow the PHP and Node code to run simultaneously, both listening for requests on the same server. I am familiar with Node, but not as much with PHP.
In short: can I have PHP and Node.js running simultaneously on the same server? If so, what considerations need to be made?
Thank you in advance!
You will most likely want to migrate to the Node.js service one endpoint at a time. That way you can test, debug, and fix things quickly without too much work. I recommend you use Express for your router and whatever database connector you want. You will want to canary test between the two as well.
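As a rough sketch of that phased setup, something like the following could work. It assumes the legacy PHP site stays behind Apache at localhost:8080 and that the http-proxy-middleware package is installed; the port, the example route, and the handler are placeholders for whatever your cPanel environment actually uses.

    // app.js - Node.js front door for the phased migration (sketch only)
    const express = require('express');
    const { createProxyMiddleware } = require('http-proxy-middleware');

    const app = express();

    // Phase 1: endpoints that have already been ported to Node.js
    app.get('/api/reports', (req, res) => {
      res.json({ served_by: 'node' }); // hypothetical placeholder handler
    });

    // Anything not handled above falls through to the legacy PHP app
    app.use(createProxyMiddleware({
      target: 'http://localhost:8080', // assumed address of the PHP/Apache vhost
      changeOrigin: true,
    }));

    app.listen(3000, () => console.log('Node front door listening on port 3000'));

With the public hostname pointed at the Node process (or Apache proxying to it), you can move routes over one at a time and roll a route back simply by deleting its Express handler.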
I want to use AMP PHP and create a project. So I started with one of the examples on GitHub, and I can see "hello world".
Now if I make changes to my code, I have to restart the server every time. But this is not how it should work, right?
Do I have to run some kind of file watcher which restarts the server every time I change the code? Or should the AMP PHP server work as a proxy which then calls php-fpm instances, like an NGINX server would? If so, can I use the async libraries without the loop? (Since the loop is on the server.)
How is the framework supposed to work? It seems that I am misunderstanding something here.
Best regards
Yes, you'll need to restart the server on changes. You can use a file watcher to do this automatically; PHP doesn't currently provide a hot-reload feature.
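As one way to wire that up, here is a small Node.js watcher sketch that restarts the server process whenever something under src changes. It assumes your entry point is server.php and uses the chokidar package; both are placeholders, and an off-the-shelf tool such as nodemon or entr does essentially the same job without custom code.

    // watch.js - dev helper (sketch): restart the Amp server on file changes
    const { spawn } = require('child_process');
    const chokidar = require('chokidar');

    let server;
    function start() {
      // assumed entry point; adjust to your project
      server = spawn('php', ['server.php'], { stdio: 'inherit' });
    }

    start();

    // Watch the source tree and restart with a small debounce
    let timer;
    chokidar.watch('src', { ignoreInitial: true }).on('all', () => {
      clearTimeout(timer);
      timer = setTimeout(() => {
        if (server) server.kill();
        start();
      }, 200);
    });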
You can't use cooperative multitasking without a scheduler / event loop, no.
Let me start off by saying that I know this is not the preferred way to run Python, but I have had this website for several years and am looking to add additional functionality. If I try to move the site to a new host and server setup, I am afraid I will mess everything up.
I am using a GoDaddy shared server for my website, and I access it using cPanel. The website is a WordPress blog but also has a few tools I built using PHP and a SQL database to store the output. I want to create a chatbot using Python, but from what I understand, I can't use Django on a shared GoDaddy server.
Is there a way for me to run Python scripts given my limitations?
Is the best alternative for me to start a second server and build an API to process the conversation and send it back to my current website?
Shared hosting solutions tend to limit the software that can run on them. The last time I used GoDaddy, they had only a PHP stack, so probably no, you won't be able to use Python there.
But that's fine, you shouldn't!
If you plan on using Python, I recommend you get a VPS or switch to a cloud service like OpenShift.
You can find cheap and reliable VPS servers nowadays, so go for it.
I have been running a PHP site for years on my own servers. I recently purchased a dedicated server package and am trying to move my site to the dedicated server. I recently upgraded to PHP 5, and my current server is running PHP 5.6.16. I moved the files and the database and put the site on a live test domain, but it is not functioning properly on the new dedicated server. Several key scripts are non-functional.
I made sure that the dedicated server is running a version of 5.6 and configured it with the same settings I have on the old server. I can see that the site is talking to the MySQL database. I turned on error reporting, and I see no significant errors suggesting why these important scripts are now non-functional. I made sure the include path is there; if it weren't, nothing would work.
What am I overlooking? What could be different between one server and the other that might impact PHP functionality? I'm basically at my wits' end here, so if this seems stupid please forgive me, but I don't know where to look next.
Start with the basics.
Does your web server respond to static page requests?
Is your web server configured to use PHP?
Can your web server execute and/or connect to PHP?
If you have a simple script with <?php phpinfo(); in it, does it work?
Are all the expected modules there in your phpinfo() output?
Do you have rewrite rules that need to be reconfigured? (Check your web server error log. Check your response status codes.)
Assuming PHP is all good, move into your application.
Are you absolutely sure error logging is on? (Again, check your phpinfo() output. Try to force an error, maybe a syntax error, and see if it shows up.)
How do you know your application is connecting to MySQL?
Start with a basic script that just echoes some things.
Comment out large swaths of code and see if you can narrow down the problem that way, re-enabling chunks as you go. (You want to bisect the problem, cutting it in half and in half again until you figure out exactly what the issue is.)
Other system-level things to check...
File system permissions? (See also https://serverfault.com/questions/48587/is-there-a-linux-log-for-when-a-user-is-denied-access-to-files-due-to-permission, for Linux.)
Firewalls? (Are you sure you can actually access MySQL over the network?)
Disk? (Are you out of space? Are your partitions set up correctly? Is /tmp full?)
Once you figure out the problem, some advice:
Do this sort of thing regularly. Write a provisioning script that can build you a new machine from one command, and run it regularly. These days, with cloud providers (physical hardware or not), there's no reason you can't blow away your application servers on a regular basis and re-provision them. Consider making this your system upgrade strategy. (Why reboot to get a new kernel when you can just cut over to a whole new server with the new kernel and other patches?)
Ensure your development environment closely matches your production environment. (Consider Vagrant for this.)
You're using version control, right? If not, start using version control so that you can hack on your code for things like this and easily roll back when done.
Looking for some suggestions on the best way to implement (and the feasibility of implementing) offsite backup of the data in my PHP app.
So we have a PHP app that runs on the client's local site, which dumps the MySQL data to a datefile.sql each night. What's the best way to get this moved to an external server?
We host a server that we currently FTP the files to manually each morning. How best can we automate this? Would we need to hard-code FTP credentials? And if we had multiple clients, how could we separate this out so that no hard-coded credentials are needed?
The ideal situation would be to have a MySQL instance running on the external server that the local client's server replicates the data to on the fly, and back again if required. I'm not even sure that's possible?
Happy to try and explain further if needed.
Thanks,
Any ideas?
You could create a bash script running on your server, called by a cron job at night, that uses rsync to fetch the SQL file from the clients' servers (if you have an SSH connection to them) and restore it on your own machine.
You can achieve this using cron. Just create a cron job and schedule it to run when you need it to. For all the file-transferring hassle, you may use rsync (which also provides ways to transfer only changed data, etc.).
Also, I think MySQL has a built-in feature for replication and backups, but I'm not sure about this or how to configure it.
I'm currently developing a PHP application that is going to use WebSockets for client-server communication. I've heard numerous times that PHP shouldn't be used for server applications because of its lack of threading mechanisms, its memory management (cyclic references), or its clunky socket library.
So far, everything is working quite well. I'm using phpws as the websocket library and the Doctrine DBAL to access different database systems; PHP is version 5.3.8. The server should serve a maximum of 30 clients. Yet especially in the last few days I've read several articles about the ineffectiveness of PHP for long-running applications.
Now I'm not sure whether I should continue using WebSockets with PHP or rebuild the entire server-side application. I've tried Python with Socket.IO, though I did not get the results I expected.
I guess I have the following options:
Keep everything as it is.
Make the application use Ajax in combination with Socket.IO, e.g. run a server-side script that invokes the clients' Ajax calls when data is submitted to the server.
The last option sounds quite interesting, though it would require some work. Would it be a problem for the server to execute all the clients' requests at one time?
What would you recommend? Is the problem with PHP's memory management (I'm using gc_collect each time a client sends data to the server) still valid? Are there other reasons besides the obvious ones (no threading, ...) for not using PHP as a server?
You can try running Socket.IO on a Node server on another port of your server (that is, if you are not using a hosting plan like GoDaddy's).
I am using it and the performance is really satisfactory.
I have an Apache server on port 80 serving my PHP files, and my server-client communication is done using a Node.js server running Socket.IO on port 8080 (dev) or 843 (prod).
Node.js is really light and has great performance, but you need to run it as a server. Nodejitsu.com is a hosting solution that has the WebSocket protocol available and is in beta, so it is still free for now. Just note that you need to listen on port 80 with Socket.IO there; this is a limitation of their network.
If you want all of your pages to be accessed on port 80, then you will need a reverse proxy like Varnish.
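To give an idea of how little code the Node.js side needs, here is a minimal standalone Socket.IO server sketch using a current Socket.IO release; the port and the "chat" event name are just placeholders for your own setup.

    // server.js - standalone Socket.IO server on port 8080; the PHP site
    // keeps serving pages from Apache on port 80 as usual
    const { Server } = require('socket.io');

    const io = new Server(8080, {
      cors: { origin: '*' }, // pages are served from a different port/origin
    });

    io.on('connection', (socket) => {
      // relay a simple chat event to every connected client
      socket.on('chat', (msg) => {
        io.emit('chat', msg);
      });
    });

On the PHP-rendered page, the client just connects with io('http://your-host:8080') from socket.io-client and listens for the same events.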
I hope that helps! Have a nice day.
Are there other reasons besides the obvious ones (no threading, ...) for not using PHP as a server?
Yep, lots of the socket functions are incompatible with each other, and it's hell to debug.
I tried something similar myself and quit, frustrated, since every function I thought would make sense didn't do what I expected.