Here is my situation:
I have multiple servers under a load balancer. My users are able to upload files to the server. These files need to be modified and then sent to another server. To do that, I decided to use Gearman to manage queues of the PHP scripts that will do the work.
Here is the problem: what happens if one server goes down? Gearman then executes the code to fetch a particular file, but that file lives on the server that went down. How can I set up Gearman to wait for that server to come back so it gets the right file? Even if no server goes down, since Gearman talks to all the servers it might execute the code on one machine while the file it needs sits on another.
How can I get around this problem without having to isolate the Gearman servers from each other? I will be using Gearman for other things that don't have the same constraints.
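For reference, here is a minimal sketch of the kind of client/worker pair described above, using the PECL gearman extension. The function name process_file, the server address, and the workload fields are placeholders rather than the actual setup.

    <?php
    // client.php (sketch): runs on the web server that received the upload.
    $client = new GearmanClient();
    $client->addServer('127.0.0.1', 4730);               // gearmand host/port (placeholder)

    // Queue a background job describing the uploaded file. The payload only
    // carries a path, so whichever worker picks the job up must be able to
    // reach that path, which is exactly the locality problem described above.
    $client->doBackground('process_file', json_encode([
        'path' => '/var/uploads/example.jpg',             // placeholder path
    ]));

    <?php
    // worker.php (sketch): runs on each application server.
    $worker = new GearmanWorker();
    $worker->addServer('127.0.0.1', 4730);
    $worker->addFunction('process_file', function (GearmanJob $job) {
        $data = json_decode($job->workload(), true);
        // modify the file and send it to the destination server here
    });
    while ($worker->work());                              // process jobs forever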
I am new to RabbitMQ and am hoping for suggestions from experts here. I have a LAMP server running a PHP application.
My situation is this: a third-party remote server drops a file with new data onto my LAMP server at random times, based on some calculation done on the remote side.
Whenever a new file is dropped on my LAMP server, I need to run a few functions and update my database.
I could do this with cron on my server, but I would need to run it every minute, and I don't think that is the best approach, since the gap between new files can be a minute or several days.
I heard from someone that RabbitMQ can help me with this. My expectation is to build a system with a listener that detects when a new file has been dropped and only then triggers the PHP function to update my database.
Please help me understand how I can take advantage of RabbitMQ in this situation.
Thanks in advance.
Sanny
You can use RabbitMQ for this only if the third-party remote server sends an AMQP message to your LAMP server; RabbitMQ does not watch for files or anything like that.
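If the remote server can publish such a message when it drops the file, a consumer on the LAMP box would look roughly like the sketch below. It uses the php-amqplib library; the queue name file_dropped, the credentials, and the message contents are assumptions.

    <?php
    // consumer.php (sketch): reacts to "file dropped" messages instead of polling with cron.
    require __DIR__ . '/vendor/autoload.php';

    use PhpAmqpLib\Connection\AMQPStreamConnection;

    $connection = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
    $channel    = $connection->channel();
    $channel->queue_declare('file_dropped', false, true, false, false);

    $callback = function ($msg) {
        $filename = $msg->body;         // e.g. the path of the file that was dropped
        // run your existing PHP functions / database update here
        echo "Processing $filename\n";
    };

    // no_ack = true keeps the sketch short; use explicit acknowledgements in real code.
    $channel->basic_consume('file_dropped', '', false, true, false, false, $callback);

    while ($channel->is_consuming()) {
        $channel->wait();               // block until the next message arrives
    }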
The best way to solve your problem is to create a webhook on your LAMP server that processes the dropped file (assuming the file is delivered via an HTTP POST request to your server).
That way, whenever a new file arrives, the webhook can handle it.
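A webhook here is just an endpoint the remote server POSTs the file to. Here is a minimal sketch, where the field name datafile, the target directory, and update_database() are assumptions standing in for your own code:

    <?php
    // webhook.php (sketch): the third-party server POSTs the new file to this URL.
    function update_database(string $path): void
    {
        // stand-in for your existing functions that parse the file and update MySQL
    }

    if ($_SERVER['REQUEST_METHOD'] !== 'POST' || empty($_FILES['datafile'])) {
        http_response_code(400);
        exit('Expected a POST with a "datafile" upload');
    }

    $target = '/var/data/incoming/' . basename($_FILES['datafile']['name']);   // placeholder dir
    if (!move_uploaded_file($_FILES['datafile']['tmp_name'], $target)) {
        http_response_code(500);
        exit('Could not store the file');
    }

    update_database($target);   // the file just arrived, so update right away - no cron needed
    echo 'OK';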
I have been running a PHP site for years on my own servers. I recently purchased a dedicated server package and am trying to move my site to the dedicated server. I recently upgraded to PHP 5, and my current server is running PHP 5.6.16. I moved the files and the database and put the site on a live test domain, but it is not functioning properly on the new dedicated server. Several key scripts are non-functional. I made sure that the dedicated server is running a version of 5.6, and I have configured it with the same settings I have on the old server. I can see that the site is talking to the MySQL database. I turned on error reporting and I see no significant errors suggesting why these important scripts are now non-functional. I made sure the include path is there; if it weren't, nothing would work. What am I overlooking? What could be different between one server and the other that might impact PHP functionality? I'm basically at my wit's end here, so if this seems stupid please forgive me, but I don't know where to look next.
Start with the basics.
Does your web server respond to static page requests?
Is your web server configured to use PHP?
Can your web server execute and/or connect to PHP?
If you have a simple script with <?php phpinfo(); in it, does it work?
Are all the expected modules there in your phpinfo() output? (A quick check script is sketched after this list.)
Do you have rewrite rules that need to be reconfigured? (Check your web server error log. Check your response status codes.)
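As a concrete starting point, a throwaway script along these lines answers the phpinfo and modules questions in one request. The extension list below is only an example; compare it against whatever the old server actually loads.

    <?php
    // check.php (sketch): drop into the web root of the new server and compare
    // the output with the same script run on the old server.
    $needed = ['mysqli', 'pdo_mysql', 'mbstring', 'gd', 'curl'];   // example list only

    foreach ($needed as $ext) {
        echo $ext, ': ', extension_loaded($ext) ? 'loaded' : 'MISSING', "<br>\n";
    }

    phpinfo();   // full configuration dump for side-by-side comparison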
Assuming PHP is all good, move into your application.
Are you absolutely sure error logging is on? (Again, check the phpinfo() output. Try to force an error, maybe a syntax error or a deliberate notice, and see whether it shows up; a minimal probe script is sketched after this list.)
How do you know your application is connecting to MySQL?
Start with a basic script that just echos some things.
Comment out large swaths of code and see if you can narrow down the problem that way, re-enabling chunks as you go. (You want to bisect the problem, cutting it in half again and again until you figure out exactly what the issue is.)
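A minimal probe of the kind meant above might look like this; it forces errors to be visible and tests the MySQL connection outside the application. The host and credentials are placeholders for whatever your application's config uses.

    <?php
    // probe.php (sketch): run on the new server, independent of the application.
    ini_set('display_errors', '1');
    ini_set('log_errors', '1');
    error_reporting(E_ALL);

    echo 'PHP version: ', PHP_VERSION, "\n";

    // Use the same connection parameters as the application's config (placeholders here).
    $mysqli = @new mysqli('localhost', 'app_user', 'secret', 'app_db');
    if ($mysqli->connect_error) {
        echo 'MySQL connection FAILED: ', $mysqli->connect_error, "\n";
    } else {
        echo 'MySQL connection OK, server ', $mysqli->server_info, "\n";
    }

    // If error display/logging really is on, this notice should show up.
    trigger_error('Deliberate test notice', E_USER_NOTICE);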
Other system-level things to check...
File system permissions? (See also https://serverfault.com/questions/48587/is-there-a-linux-log-for-when-a-user-is-denied-access-to-files-due-to-permission, for Linux.)
Firewalls? (Are you sure you can actually access MySQL over the network?)
Disk? (Are you out of space? Are your partitions set up correctly? Is /tmp full?)
Once you figure out the problem, some advice:
Do this sort of thing regularly: write a provisioning script that builds you a new machine from one command. These days with cloud providers (physical hardware or not) there's no reason you can't blow away your application servers on a regular basis and re-provision them. Consider making this your system upgrade strategy. (Why reboot to get a new kernel when you can just have a whole new server with the new kernel and other patches that you can cut over to?)
Ensure your development environment closely matches your production environment. (Consider Vagrant for this.)
You're using version control, right? If not, start using version control so that you can hack on your code for things like this and easily roll back when done.
I've written a web socket server that listens on a specific port. In order to run it, I log in to my EC2 instance with PuTTY and run:
php server.php
I was wondering if this is the only way, and the right way, to do it. Normally copying my PHP files to the host via FTP would be enough; I don't understand why the php command needs to be run to start the server.
Any help is appreciated.
PHP here is just a script file, the same as Bash (.sh), Python (.py), Node.js (.js), or anything similar.
They all have to be executed. Normally, Apache, nginx, or whatever web server you use executes those scripts for you for each request made to the web server.
Since your script creates a socket, you have to start it yourself: it opens one socket and then keeps running on its own. It is not executed once per request. In fact, make sure it is not executed by Apache, so do not put it in the usual website directory.
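To make the difference concrete, a stripped-down long-running socket server looks roughly like this (a generic sketch, not the asker's server.php). It binds the port once and then loops forever, which is why it has to be started as its own process with php server.php instead of being served by Apache:

    <?php
    // Sketch of a long-running socket server: one process, one listening socket, runs until killed.
    $server = stream_socket_server('tcp://0.0.0.0:8080', $errno, $errstr);   // port is a placeholder
    if ($server === false) {
        die("Could not bind: $errstr ($errno)\n");
    }

    while (true) {                                        // the script never "finishes"
        $client = @stream_socket_accept($server, -1);     // block until a client connects
        if ($client === false) {
            continue;
        }
        fwrite($client, "hello\n");                       // talk to the client
        fclose($client);
    }

In practice you would keep such a process alive with nohup, screen/tmux, or a process supervisor (systemd, Supervisor, and so on) rather than leaving a PuTTY session open.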
Our website currently backs up every night to a separate server that we have, which is fine, but downloading the files the next day (usually around 36,000+ images) takes quite some time and affects the speeds of everyone else using our network. Ideally we would do this in the middle of the night, except there's no one here to do it.
The server that the backup is on is running cPanel, which appears to make it fairly simple to run a PHP file as a cron job.
I'm assuming the following, feel free to tell me I'm wrong.
1) The server the backup is on runs cPanel. It appears that it shouldn't be too difficult to set up a PHP script to run as a cron job in the middle of the night.
2) We could deploy a PHP script that uses the FTP functions to connect to another server and start the backup of these files from that cron job.
3) We are running XAMPP on a Windows platform. It includes FileZilla, so I'm assuming it should be able to accept incoming FTP connections.
4) Overall, we could deploy a script on the backup server that runs every night and sends the files back to my local computer running XAMPP.
So that's what I'm guessing. I'm getting stuck at the first hurdle, though. I've tried to create a script that runs on our local computer and sends a specified folder to the backup server when it executes, but all I seem to be able to find are scripts relating to single files. Although I have some experience with PHP, I haven't touched the FTP functions before, and they are giving me some problems. I've tried the other examples here on Stack Overflow with no success :(
I'm just looking for the simplest form of a script that can upload a folder to a remote IP. Any help would be appreciated.
There is a fair amount of overhead involved in transferring a bunch of small files over FTP. I've seen jobs take 5x as long, even over a local network. It is by far easier to pack the files into something like a zip archive and send them as one large file.
You can use exec() to run zip from the command line (or whatever compression tool you prefer). After that, you can send it over FTP pretty quickly (you said you found methods for transferring a single file). For backup purposes, having the files zipped probably makes things easier to handle, but if you need them unzipped you can set up a job on the other machine to unpack the archive.
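A rough sketch of that zip-then-transfer idea as a cron-able PHP script on the backup server; the directory, host, and FTP credentials are placeholders:

    <?php
    // nightly_backup.php (sketch): pack the images into one archive, then push it over FTP.
    $sourceDir = '/home/site/backups/images';                    // placeholder path
    $zipFile   = '/tmp/images-' . date('Y-m-d') . '.zip';

    // One big file avoids the per-file FTP overhead mentioned above.
    exec('zip -r ' . escapeshellarg($zipFile) . ' ' . escapeshellarg($sourceDir), $output, $status);
    if ($status !== 0) {
        die("zip failed with status $status\n");
    }

    // Send the single archive with PHP's FTP functions.
    $conn = ftp_connect('192.0.2.10');                           // placeholder IP of the receiving machine
    if (!$conn || !ftp_login($conn, 'backupuser', 'secret')) {
        die("FTP connect/login failed\n");
    }
    ftp_pasv($conn, true);                                       // passive mode is usually friendlier to firewalls
    if (!ftp_put($conn, basename($zipFile), $zipFile, FTP_BINARY)) {
        die("upload failed\n");
    }
    ftp_close($conn);
    unlink($zipFile);                                            // clean up the temporary archive
    echo "Backup uploaded\n";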
I developed a chat application with an accompanying chat server. Everything is working fine. The issue is that whenever the chat server goes down (for instance, the server system shuts down as a result of a power failure or some other problem), the chat server has to be restarted manually once the server system comes back on.
I believe it is more appropriate for the chat server application to restart itself when the computer comes back on, regardless of who is logged in and, indeed, before anyone logs in. I have a batch file that launches the chat server. My attempt was to create a Windows service that starts automatically and runs this batch file under a Network Service account on the server system. I'm having a hard time with this at the moment, so I would love to ask if there are any alternatives to using a Windows service. Suggestions are highly appreciated.
Creating a Windows service would be the better solution, but you can also add your batch file to the Startup folder.
I think you already have the better solution (a Windows service). Adding an email alert, or some other kind of alert, when the server restarts would also be handy.
I would probably just start the server using the Windows Task Scheduler; you can configure a task to run at system startup.