I need to print an aging report with a large number of pages using PHP.
The number of pages may be up to a thousand.
What's the best way to print this report?
- direct send to printer
- css
- pdf
- others?
Any suggestions?
Thank you!
If there are thousands of pages, I would recommend using a background task to generate a PDF file.
To do this, your frontend can write, for example, a database entry that says "todo". Then configure a cron job (or similar) that calls a PHP script, e.g. every minute, to check whether there is a pending PDF-generation todo.
Check in your PHP configuration that you are allowed to use the set_time_limit() function. This function resets the maximum execution time of a script. You must call it in your generator script before the maximum execution time has been exceeded; otherwise your PHP script will be stopped before the work is done.
Another problem may be the PHP memory limit. You can increase it by setting "memory_limit" to a value that matches your use case. Try different values to find a limit that is high enough to run your script, and avoid keeping too much data in memory in your PHP script.
While the background script is running, you can write a file or a database entry that the frontend can read to show the PDF generator's progress. When the generator has finished, you can offer the file for download in the frontend.
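As a rough illustration, here is a minimal sketch of such a worker script, assuming a hypothetical pdf_jobs table written by the frontend and a hypothetical generate_report_page() helper that wraps whatever PDF library you choose (see the links below):
<?php
// Hypothetical worker script, started by cron every minute.
set_time_limit(0);                // or call set_time_limit() again periodically inside the loop
ini_set('memory_limit', '512M');  // adjust to your use case

$pdo = new PDO('mysql:host=localhost;dbname=reports', 'user', 'pass');

// Look for a pending job ("todo") written by the frontend.
$job = $pdo->query("SELECT * FROM pdf_jobs WHERE status = 'todo' LIMIT 1")->fetch();
if (!$job) {
    exit; // nothing to do right now
}
$pdo->exec("UPDATE pdf_jobs SET status = 'running' WHERE id = " . (int)$job['id']);

$totalPages = 1000; // however many pages the aging report needs
for ($page = 1; $page <= $totalPages; $page++) {
    generate_report_page($page); // hypothetical helper around your PDF library
    // progress information the frontend can display
    file_put_contents('progress_' . $job['id'] . '.txt', $page . ' / ' . $totalPages);
}

$pdo->exec("UPDATE pdf_jobs SET status = 'done' WHERE id = " . (int)$job['id']);
?>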
Here are some links for further reading:
http://www.php.net/manual/en/book.pdf.php
How to create cron job using PHP?
http://www.php.net/manual/en/info.configuration.php
http://www.php.net/manual/en/function.set-time-limit.php
I made a database with a web interface (PHP) for a store, and I made a logging system ("history page") so that every change made to the stock is saved. I implemented it with a MySQL table with 5 columns (date / user / action / product / changes), but what if I used a file "log.txt" that is updated every time a user performs an action? Which is better / faster, and why?
For text files, unless you create a system that ensures the file is only written to by one thread at a time (like a logging framework would do), you might run into concurrency issues.
A SQL table (MS SQL, MySQL, Postgres, etc.) would be able to handle many concurrent log messages at once. It is, however, a bit of overkill, and as your table grows some queries against it may slow down; your database file size will grow too.
Given your scenario (a PHP web app with a history page), SQL is going to be preferable to a text file.
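As a sketch of the SQL approach, assuming the five-column history table you describe (the table name history and the connection details here are placeholders):
<?php
// Placeholder connection details.
$pdo = new PDO('mysql:host=localhost;dbname=store', 'user', 'pass');

// One prepared statement, reused for every stock change.
$stmt = $pdo->prepare(
    'INSERT INTO history (`date`, `user`, `action`, `product`, `changes`)
     VALUES (NOW(), ?, ?, ?, ?)'
);
$stmt->execute(['alice', 'update', 'SKU-1234', 'quantity 10 -> 8']);
?>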
Both are essentially the same thing, right? Both store the data to a file and read the data from the file.
Generally, I would say that if you need to find specific data within the file, a database is going to make accessing that data easier and faster. If you only need to append to the data and then read, for example, the last 1000 lines, a text file is going to be easier and faster.
If you decide to go the text-file route, I would recommend using a logging utility to write the log; the logging utility will deal with the concurrency issues.
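For example, a minimal sketch using the Monolog library (assuming Monolog 2.x installed via Composer; the channel name and file path are arbitrary):
<?php
require 'vendor/autoload.php';

use Monolog\Logger;
use Monolog\Handler\StreamHandler;

// A logger that appends records to log.txt.
$log = new Logger('stock');
$log->pushHandler(new StreamHandler(__DIR__ . '/log.txt', Logger::INFO));

$log->info('stock change', [
    'user'    => 'alice',
    'action'  => 'update',
    'product' => 'SKU-1234',
    'changes' => 'quantity 10 -> 8',
]);
?>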
I have a start_processing("set 1") function which takes 6 hours to complete set 1.
I want to process set 2, set 3, and so on. How can I process all of them at the same time? When I run
<?php
start_processing("set1");
start_processing("set2");
start_processing("set3");
?>
it takes 18 hours.
I want all of the processing to complete in 6 hours.
Finally I got a solution.
I used curl_multi - it is far better. It saves the handshakes - they are not needed every time!
Use curl_multi_init to run the processes in parallel. This can have a tremendous effect.
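A minimal sketch of the curl_multi approach, assuming each set is kicked off by requesting a worker URL such as process.php?set=... (the URLs are placeholders):
<?php
// Placeholder worker URLs - each one triggers start_processing() for one set.
$urls = [
    'http://example.com/process.php?set=set1',
    'http://example.com/process.php?set=set2',
    'http://example.com/process.php?set=set3',
];

$mh = curl_multi_init();
$handles = [];

foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Drive all handles until every request has finished.
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh); // avoid busy-waiting
} while ($running > 0);

foreach ($handles as $ch) {
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
?>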
Unless you are using PHP as an Apache module, you can use pcntl_fork to create several processes, each of which handles one function call.
// pcntl_fork() returns the child's PID in the parent and 0 in the child
if (pcntl_fork()) {            // original process
    start_processing("set1");
} elseif (pcntl_fork()) {      // first child, which forks another child
    start_processing("set2");
} else {                       // the child forked by the first child
    start_processing("set3");
}
If you have a varying number of working sets, just put them in an array and loop through it. Just bear in mind that too many processes could overload your system!
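A rough sketch of that loop, assuming $sets holds the working-set names and that the parent should wait for all children to finish:
<?php
$sets = ["set1", "set2", "set3", "set4"];

foreach ($sets as $set) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        die("could not fork");
    } elseif ($pid === 0) {
        // child process: handle one set, then exit
        start_processing($set);
        exit(0);
    }
    // parent: keep looping and fork the next worker
}

// parent waits for all children to finish
while (pcntl_wait($status) !== -1) {
    // collect exited children
}
?>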
Another, more lightweight option is PHP pthreads, which AFAIK works with Apache but requires installing the corresponding PHP extension first.
A third possibility, as mentioned by sandeep_kosta and Niranjan N Raju, is to create one cron job for each working set.
I need a script that works forever; that is, it will generate information in the background without stopping, downloading things and storing information in the database, as well as performing calculations.
It seems possible in PHP, like this:
<?php
ignore_user_abort();  // keep running even if the client disconnects
set_time_limit(0);    // no execution time limit
$interval = 60 * 15;  // do every 15 minutes...
do {
    // add the code that has to run every 15 minutes here
    // ...
    sleep($interval); // wait 15 minutes
} while (true);
?>
but it is generally accepted that PHP was not designed with this in mind.
1) Is there any drawback to the PHP approach (and how would I stop such a script?), or is there a better language for this, like C++?
2) What do companies like Google, which indexes the web, do?
Long-running scripts are generally a bad idea, as they're scripts for a reason. I'm not sure of a way to stop a PHP script once it is running; there probably isn't a practical way to do so. I'd recommend writing a program that runs on a server, using a language more suited to that kind of work than a scripting language.
A simple C# or Java program can run forever, as long as you don't close it. You can manipulate databases by using the corresponding language's database support.
What you're doing is typically accomplished in PHP using cron jobs.
http://www.serverwatch.com/server-tutorials/a-primer-for-scheduling-cron-jobs-in-linux.html
(or google "cron job" + your OS of choice)
A cron job can be scheduled to execute an arbitrary script every 15 minutes pretty easily.
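For example, a crontab entry along these lines would run a script every 15 minutes (the PHP binary path, script path, and log file are placeholders):
*/15 * * * * /usr/bin/php /path/to/your/script.php >> /var/log/myscript.log 2>&1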
Designing PHP scripts to run forever is considered bad practice because PHP was never designed with that in mind.
I am making changes to a script a freelancer made and noticed that at the top of his function he has set the timeout limit to 0 so it won't time out. Are there any implications to doing this? Or should I properly test his script, work out how long it takes, and determine whether it will time out on the server?
Edit - Adding more detail.
So it's a script that creates a temporary DB table and populates it by pulling data from a number of different tables; it then outputs the result to a CSV using MySQL's OUTFILE.
From the PHP docs:
set_time_limit — Limits the maximum execution time
seconds: The maximum execution time, in seconds. If set to zero, no time limit is imposed.
http://nl3.php.net/manual/en/function.set-time-limit.php
If you set the time limit to zero, it means the script can run forever without being stopped by the PHP interpreter. Most of the time this is a bad idea, because if you have a bug in your code that results in an endless loop, the script will run forever and eat your CPU.
A lot of times I've seen this in code without a good reason, so I assume the developer couldn't think of an appropriate timeout and just set 0.
There are a few valid reasons you would want to set a timeout of 0, for example a daemon script.
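For a one-off export script like the one described, a generous but finite limit is usually the safer choice. A minimal sketch (the 15-minute figure is an arbitrary example, not something prescribed by the PHP docs):
<?php
// Allow up to 15 minutes for the export instead of disabling the limit entirely.
set_time_limit(15 * 60);

// ... create the temp table and run the SELECT ... INTO OUTFILE export here ...
?>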
I have a PHP script that scrapes data from a government website and puts it in a MySQL database for easier searching. It works great, but every 6,000 or so rows it stops being able to scrape successfully. I think this is some kind of memory leak in phpQuery, the library I use to parse the HTML I fetch.
Here are the errors, and as you can see they are all in the phpQuery file. The curious thing is, once it errors out, I can restart the script at the record it started erroring on and it works fine for another 6,000 or so records.
Has anyone ever heard of this happening in phpQuery? Perhaps there are too many phpQuery objects? (I can't find a way to 'close' them)
Alternatively, do you have any suggestions for another way I can do this? At the moment, I have to restart the script manually every 40 minutes or so, and with 500,000 records that definitely adds up.
I've used phpQuery (large scale) and I didn't notice such errors.
Try to reload phpQuery every 1000 rows - just clear all variables and hope that the garbage collector fixes the problem.
New answer as I want code styling ;)
Two ways of reloading the script on the fly after 1000 rows:
On Unix hosts:
exec("php __FILE__ &");
By HTTP request:
ignore_user_abort(1);
set_time_limit(0);
... 1000 rows parsed ...
// request this same script again so a fresh process takes over
$curl_handle = curl_init();
curl_setopt($curl_handle, CURLOPT_URL, 'http://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI']);
curl_setopt($curl_handle, CURLOPT_CONNECTTIMEOUT, 1);
curl_exec($curl_handle);
curl_close($curl_handle);
To unload all documents (or a specified one) from memory, I use
phpQuery::unloadDocuments($id = null);
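As a hedged sketch of how that might look inside the scraping loop (fetch_html() and save_row() are hypothetical placeholders for the existing fetch and database code):
<?php
// Hypothetical scraping loop; only unloadDocuments() comes from phpQuery itself.
foreach ($recordUrls as $i => $url) {
    $doc = phpQuery::newDocumentHTML(fetch_html($url)); // parse the fetched page
    save_row(pq('table.results', $doc));                // extract and store the data

    if ($i > 0 && $i % 1000 === 0) {
        // free the documents phpQuery keeps in memory
        phpQuery::unloadDocuments();
    }
}
?>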