Closed. This question is opinion-based. It is not currently accepting answers.
Closed 9 years ago.
I need a script to run forever: it will generate information in the background without stopping, downloading data and storing it in the database, as well as performing calculations.
It seems possible in PHP, for example:
<?php
ignore_user_abort();  // keep running even if the user disconnects
set_time_limit(0);    // remove the execution time limit
$interval = 60 * 15;  // repeat every 15 minutes

do {
    // add the code that has to run every 15 minutes here
    // ...
    sleep($interval); // wait 15 minutes
} while (true);
?>
but it is generally accepted that PHP was not designed with this in mind.
1) Are there any drawbacks to the PHP approach (for example, how do you stop the script?), or is there a better-suited language for this, such as C++?
2) What do companies like Google, which continuously indexes the web, do?
Long-running scripts are generally a bad idea in PHP, which was built around short-lived requests. I'm not aware of a clean way to stop such a PHP script once it is running, short of killing the process. I'd recommend writing a program that runs on a server, using a language better suited to that kind of long-running work than a web scripting setup.
A simple C# or Java program can run forever, as long as you don't close it, and you can manipulate databases using each language's database support.
What you're doing is typically accomplished in PHP using cron jobs.
http://www.serverwatch.com/server-tutorials/a-primer-for-scheduling-cron-jobs-in-linux.html
(or google "cron job" + your OS of choice)
A cron job can be scheduled to execute an arbitrary script every 15 minutes pretty easily.
Designing PHP scripts to run forever is considered bad practice because PHP was never designed with that in mind.
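For illustration, a crontab entry that runs a script every 15 minutes could look like the sketch below; the file paths are assumptions, so adjust them to your system:

```
# m   h  dom mon dow  command
*/15  *  *   *   *    /usr/bin/php /path/to/script.php >> /var/log/script.log 2>&1
```

The `>> /var/log/script.log 2>&1` part appends both normal output and errors to a log file, so failures stay visible even though no one is watching the terminal.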
Related
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 6 years ago.
I am about to embark on a massive project, using PHP for the server side and MySQL for the database. It's really important that some parts of this application keep running in the background: algorithms, database updates, etc., even if no one is on the system. How do I achieve this?
From personal research, I found out that I can create a cron job. But I've not really used one before. Is there a way a cron job can be made to run in an endless loop?
Make a PHP file which you want to run in the background,
say automate.php.
Make a cron job which runs, for example, every 30 minutes, using crontab -e and an entry like the one below:
*/30 * * * * /usr/bin/php /path/to/automate.php
Make sure the path to the php binary is correct for your system.
@olu The easiest way to get PHP code to run over and over again is to put it inside a loop that never ends and run the script from the command line. You could have a PHP loop that looks like:
function loop() {
    while (true) {
        // Do your stuff here.
        sleep(30); // Wait for 30 seconds.
    }
}
loop();
When you run a file that calls this loop, it will never end. (Note that the body is an iterative loop rather than a recursive call; a function that called itself forever would eventually exhaust the stack.) This will achieve what you want, but it isn't the best way to perform this task.
On unix/linux there is a job/task scheduler named cron. http://www.unixgeeks.org/security/newbie/unix/cron-1.html
It can be a bit tricky to learn at first, but there are lots of examples of how to schedule a cron job. Ideally this would be a better solution than running a headless loop.
I hope this helps.
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 7 years ago.
I have a start_processing("set1") function which takes 6 hours to complete one set.
I want to process set2, set3, and so on. How can I process all of them at the same time? Since when I put
<?php
start_processing("set1");
start_processing("set2");
start_processing("set3");
?>
it takes 18 hours.
I want all the processing to complete in 6 hours.
Finally I found a solution.
I used curl_multi - it is far better. It saves the handshakes - they are not needed every time!
Use curl_multi_init to run the processes in parallel. This can have a tremendous effect.
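As a rough sketch of that approach (the URLs below are placeholders, not from the question), the handles are registered with a multi handle and driven together until every transfer has finished:

```php
<?php
// Hypothetical endpoints that each trigger the processing of one set.
$urls = [
    'https://example.com/process?set=1',
    'https://example.com/process?set=2',
    'https://example.com/process?set=3',
];

$mh = curl_multi_init();
$handles = [];

foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Run all handles in parallel until every transfer is done.
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh); // wait for activity instead of busy-looping
    }
} while ($running && $status === CURLM_OK);

foreach ($handles as $ch) {
    $result = curl_multi_getcontent($ch);
    // ... process $result for this set ...
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```

The total wall-clock time is then governed by the slowest single request rather than the sum of all of them.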
Unless you are using PHP as an Apache module, you can use pcntl_fork to create several processes, each of which handles one function call.
if (pcntl_fork()) {
    start_processing("set1");
} elseif (pcntl_fork()) {
    start_processing("set2");
} else {
    start_processing("set3");
}
If you have a varying number of working sets, just put them in an array and loop through it. Just bear in mind that too many processes could overload your system!
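A minimal sketch of that array-based version, assuming the real start_processing() is defined elsewhere; the parent remembers each child PID and waits for all of them with pcntl_waitpid, so it knows when every set is done:

```php
<?php
// Placeholder for the real long-running work on one set.
function start_processing(string $set): void {
    // ... 6 hours of work ...
}

$sets = ['set1', 'set2', 'set3'];
$children = [];

foreach ($sets as $set) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        die("fork failed\n");
    } elseif ($pid === 0) {
        // Child process: handle exactly one set, then exit.
        start_processing($set);
        exit(0);
    }
    // Parent process: remember the child and keep forking.
    $children[] = $pid;
}

// Parent waits for every child to finish.
foreach ($children as $pid) {
    pcntl_waitpid($pid, $status);
}
```

With a long list of sets you would cap the number of simultaneous children, since each one competes for CPU, memory, and database connections.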
Another, more lightweight, option is PHP pthreads, which AFAIK works with Apache but requires installing the corresponding PHP extension first.
A third possibility is, as mentioned by sandeep_kosta and Niranjan N Raju, to create one Cronjob for each working set.
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 8 years ago.
Is it possible to define a variable in PHP and access it from all users connected to the server?
I need a variable or object that stores information in the server's RAM, without using a database or the server's file system.
Save the data to the variable from one connected computer, and read it back from another.
What is the best practice? Is it possible at all?
Roughly - yes, it is possible.
In order to do that you would need direct access to shared memory, which I haven't seen done directly in PHP; I'm not sure whether it is possible or not, so you can research this yourself.
What you can do, however, is take advantage of the fact that a running PHP script keeps its data in memory. Create a PHP script that runs forever and acts as a server: it reads and writes its own variables, and PHP manages the underlying memory for you, so you don't have to deal with addresses and the like (an ordinary variable declaration is all it takes). To access this running script you will need to look at how sockets work and how to establish a server-client connection. That is very well explained in this article.
However, I don't mean to be rude, but from the way you phrase your question I'd guess this may be too much to take on, so the practical option is to use Memcached or any other in-memory caching mechanism that is already built and battle-tested. There is plenty of information out there; just search for in-memory caching mechanisms.
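For example, a minimal sketch with the php-memcached extension, assuming a memcached server is already running on 127.0.0.1:11211 (the key name is arbitrary):

```php
<?php
// Requires the php-memcached extension and a running memcached daemon.
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

// One request, from one client, stores a value in the server's RAM...
$mc->set('shared_counter', 42);

// ...and any later request, from any other client, can read it back.
$value = $mc->get('shared_counter');
echo $value; // prints 42 while the key is cached
```

Because the data lives in the memcached process rather than in any single PHP request, it survives between page loads and is visible to every connected user, which is exactly the behavior being asked for.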
Good luck!
Closed. This question is opinion-based. It is not currently accepting answers.
Closed 9 years ago.
I need to print a large number of pages of an aging report using PHP.
The number of pages may be up to a thousand.
What's the best way to print this report?
- direct send to printer
- css
- pdf
- others?
Any suggestion?
Thank you!
If there are thousands of pages I would recommend using a background task to generate a PDF file.
To do this, your frontend can write, for example, a database entry that says "todo". Then configure a cron job or similar that calls a PHP script, e.g. every minute, to check for a pending PDF-generation job.
In your PHP configuration, check that you are allowed to use the set_time_limit() function. This function resets the maximum execution time of a script. You must call it in your generator script before the maximum execution time has been exceeded; otherwise your PHP script will be stopped before the work is done.
Another problem may be the PHP memory limit. You can increase it by setting memory_limit to a value that matches your use case. Try different values to find a limit that is high enough to run your script, and avoid holding too much data in your PHP script at once to keep memory usage down.
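A sketch of those two settings at the top of a hypothetical generator script (the memory value is an assumption; tune it to your report size):

```php
<?php
// Remove the maximum execution time for this long-running generator.
set_time_limit(0);

// Raise the memory limit for this script only; '512M' is a guess,
// adjust it to what a thousand-page report actually needs.
ini_set('memory_limit', '512M');

// ... generate the PDF page by page below ...
```

Setting these per-script keeps the relaxed limits away from your normal web requests, which should stay short and small.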
While the background script is running, you can write a file or a database entry that a frontend can read to show the generator's progress. When the generator has finished, you can offer the file for download in the frontend.
Here are some links for further reading:
http://www.php.net/manual/en/book.pdf.php
How to create cron job using PHP?
http://www.php.net/manual/en/info.configuration.php
http://www.php.net/manual/en/function.set-time-limit.php
Closed. This question is opinion-based. It is not currently accepting answers.
Closed 9 years ago.
I would like to be able to make synchronous server requests for a game I plan on making. I've used AJAX's synchronous calls in the past sparingly because they are slow and I know that AJAX isn't cut out for this sort of task.
My reason for this is that I want it to be (as much as possible) hack-proof. For example, if a person buys an item, the client sends a request to the server saying they wish to buy it, and the server checks whether they have enough currency and replies whether the purchase is allowed. In a flow like this, it would obviously be a pain if every purchase took several seconds.
The client side would be HTML5/JS and the server side would be PHP/SQL. What would be the best method to achieve this? Before anyone says "show me your code": I'm not asking for help fixing something broken. I'm asking for a suggestion on the best way to access a server quickly and synchronously. If it isn't possible to make it faster, then such an answer would suffice.
I'd start by building the most basic approach: a simple PHP script with minimal dependencies that loads only what is required to validate the request, connect to the database, and run the query, then returns the result.
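A sketch of such a minimal endpoint; the table, columns, and credentials below are assumptions for illustration only:

```php
<?php
// buy_item.php - minimal purchase-validation endpoint (all names are placeholders).
header('Content-Type: application/json');

$userId   = (int)($_POST['user_id'] ?? 0);
$itemCost = (int)($_POST['item_cost'] ?? 0);

// Connect with PDO; host, database, and credentials are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=game', 'user', 'pass');

// Server-side check: never trust the client's claimed balance.
$stmt = $pdo->prepare('SELECT currency FROM players WHERE id = ?');
$stmt->execute([$userId]);
$currency = (int)$stmt->fetchColumn();

echo json_encode(['allowed' => $currency >= $itemCost]);
```

Keeping the script this small means most of the latency is the database round trip, which is a good baseline to measure before reaching for heavier machinery.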
Then test its performance. If that's insufficient, I'd start looking at WebSockets or other technology stacks for just the fast-access portions (maybe Node.js?).
Making the request synchronous doesn't make it faster; it just means the browser becomes unresponsive until the request completes. Making it fast is a matter of writing your server-side code so that it runs quickly, and without any details we can't tell you how to do that.