I have a users array with more than 16k records. I need to process this array to send push notifications, but every time it exceeds the server's execution time. Please help me process the array data with something like delayed jobs, so the server doesn't time out. I tried array_chunk, but it still doesn't work. Is there any way to use something like a delayed job?
Try using the sleep() function: after, let's say, every 500 values, call sleep(1).
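A minimal sketch of that idea combined with array_chunk (the sendPush() helper and the $users array are placeholders, not from the question; running this from CLI/cron rather than a web request avoids the timeout entirely):

```php
<?php
// Hypothetical helper - replace with your real push-notification call.
function sendPush(array $user): void
{
    // ... call your push provider here ...
}

set_time_limit(0); // lift the script time limit; ideally run from CLI/cron

$users = [/* ... your 16k+ user records ... */];

// Send in chunks of 500, pausing between chunks so the work is spread out.
foreach (array_chunk($users, 500) as $chunk) {
    foreach ($chunk as $user) {
        sendPush($user);
    }
    sleep(1); // breathe for a second after every 500 users
}
```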
My fetchall fails, since I have more than 10,000,000 rows in the result.
How can I handle such a big result in this case?
Any ideas?
Thank you in advance,
Tony
As the others said, it is a very bad idea to fetch that many rows at once. You either need to restrict your result, or loop through the database by fetching a batch, processing it, and fetching the next batch, as in the sketch below. A while loop should be your friend.
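A sketch of that batch loop with PDO, assuming a hypothetical items table with an auto-increment id (table and column names are placeholders). It seeks past the last seen id rather than using OFFSET, which degrades badly on a 10M-row table:

```php
<?php
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$batchSize = 10000;
$lastId    = 0;

while (true) {
    // Keyset pagination: fetch the next batch after the last processed id.
    $stmt = $pdo->prepare(
        'SELECT id, payload FROM items WHERE id > ? ORDER BY id LIMIT ?'
    );
    $stmt->bindValue(1, $lastId, PDO::PARAM_INT);
    $stmt->bindValue(2, $batchSize, PDO::PARAM_INT);
    $stmt->execute();

    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
    if (!$rows) {
        break; // no more batches
    }

    foreach ($rows as $row) {
        // process one row
    }

    $lastId = end($rows)['id']; // remember where this batch ended
}
```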
I made a platform to remind me of some things I haven't done. I've stored these reminders in a database, and now I want this PHP-based website to send me emails. I want to check the database every morning and, if I find critical records, send notifications to my email address. How can I do this? Thank you :)
You have to implement a cron job for this. For more details about cron jobs, see the following URL: https://stackoverflow.com/a/30872993/2706551
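For illustration, a minimal version of such a script; the table, column, and e-mail address are assumptions, and the crontab entry shown in the comment would run it every morning at 7:

```php
<?php
// check_reminders.php - run by cron each morning, e.g. via the crontab entry:
//   0 7 * * * /usr/bin/php /var/www/check_reminders.php

$pdo = new PDO('mysql:host=localhost;dbname=reminders', 'user', 'pass');

// A "reminders" table with a "critical" flag is an assumed schema.
$critical = $pdo->query('SELECT title FROM reminders WHERE critical = 1')
                ->fetchAll(PDO::FETCH_COLUMN);

if ($critical) {
    $body = "Critical reminders:\n- " . implode("\n- ", $critical);
    // mail() needs a configured MTA; a library like PHPMailer also works.
    mail('you@example.com', 'Critical reminders', $body);
}
```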
I have a list of about 20,000 zip codes that I need to check against. Should I store them in a PHP file as an array? How much memory would that occupy?
Or should I query MySQL every time to check whether the zip code exists in a database table? Which way is faster? I assume the first option would be faster, since the connection to the database alone may slow down the MySQL option quite significantly. I'm just a bit concerned about the memory cost of including that PHP file on every call.
Databases are specifically designed to store and search through large amounts of data efficiently and quickly. If you were to put a 20,000-element array in every PHP file, it would drastically slow down every page load, even when the array wasn't being used.
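To make the comparison concrete, here is a sketch of the database approach; with an index on the zip column the lookup is fast and uses almost no PHP memory (table and column names are assumptions):

```php
<?php
$pdo = new PDO('mysql:host=localhost;dbname=geo', 'user', 'pass');

// Assumed one-time setup: CREATE TABLE zip_codes (zip CHAR(5) PRIMARY KEY);
// The primary-key index makes each lookup a cheap B-tree search.
$zip  = '90210';
$stmt = $pdo->prepare('SELECT 1 FROM zip_codes WHERE zip = ? LIMIT 1');
$stmt->execute([$zip]);
$exists = (bool) $stmt->fetchColumn();
```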
I have a table with 9 columns and 400,000 records, using PHP and MySQL for the database. The problem I am facing is that it takes quite a long time to fetch particular data or search the records. Can anyone please suggest whether I should use another database or make some tweaks to this one, and also suggest the best hosting to handle this many records on my site?
That many records is not considered large data. What you need to do is make sure you have proper indexes on your table columns and, most importantly, load only the data that is required, i.e. implement paging.
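A sketch of both suggestions (index plus paging), with hypothetical table and column names:

```php
<?php
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// One-time setup on the searched column (assumed name):
//   CREATE INDEX idx_records_name ON records (name);

$perPage = 50;
$page    = max(1, (int) ($_GET['page'] ?? 1));
$offset  = ($page - 1) * $perPage;
$search  = $_GET['q'] ?? '';

// A prefix search ("abc%") can use the index; load only one page of rows.
$stmt = $pdo->prepare(
    'SELECT * FROM records WHERE name LIKE ? ORDER BY id LIMIT ? OFFSET ?'
);
$stmt->bindValue(1, $search . '%');
$stmt->bindValue(2, $perPage, PDO::PARAM_INT);
$stmt->bindValue(3, $offset, PDO::PARAM_INT);
$stmt->execute();
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
```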
What are the drawbacks of reading, let's say, a table of up to 10,000 rows into a PHP array versus reading and processing it row by row?
Assuming your data doesn't use up all your memory, I'd say there is no point in using arrays. When you send a query to the server and get the result, you have the entire result set in memory anyway, just in the libmysql (dll/so) client memory space instead of PHP's memory space. Fetching rows from there one by one is quite fast, since libmysql is compiled and highly optimized, while PHP is an interpreted language. The difference may not be immediately apparent on small results, but on big ones you will notice.
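The two styles side by side, sketched with PDO (the table name is a placeholder):

```php
<?php
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');

// Option 1: fetchAll() copies the whole result set into a PHP array,
// roughly doubling memory use (libmysql buffer plus the PHP array).
$rows = $pdo->query('SELECT * FROM big_table')->fetchAll(PDO::FETCH_ASSOC);
foreach ($rows as $row) {
    // process $row
}

// Option 2: fetch row by row - rows stay in the client library's buffer
// and cross into PHP memory one at a time.
$stmt = $pdo->query('SELECT * FROM big_table');
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    // process $row
}
```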