Include an array of 20k zip codes or call MySQL [closed] - php

I have a list of about 20,000 zip codes that I need to check against. Should I store them in a PHP file as an array? How much memory would that occupy?
Or should I call MySQL every time and check a database table to see if the code exists? Which way is faster? I assume the first option should be faster, since the database connection alone may slow down the MySQL option significantly. But I'm a bit concerned about memory if that PHP file is included on every request.

Databases are specifically designed to store and search through large amounts of data efficiently and quickly. If you were to put a 20,000-element array in every PHP file, it would drastically slow down every page load, even when the array wasn't being used.
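For scale, an array of 20,000 short strings costs on the order of a few megabytes in PHP 7+, so raw memory isn't the real problem; parsing and loading it on every request is. On the database side, a single indexed lookup is very cheap. A minimal sketch, assuming a zip_codes table with an index on its zip column:

```php
<?php
// Minimal sketch: check a zip code against an indexed MySQL table.
// Assumes a zip_codes table with a PRIMARY KEY or index on the zip column.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

function zipExists(PDO $pdo, string $zip): bool
{
    // SELECT 1 ... LIMIT 1 lets MySQL stop at the first index hit.
    $stmt = $pdo->prepare('SELECT 1 FROM zip_codes WHERE zip = ? LIMIT 1');
    $stmt->execute([$zip]);
    return (bool) $stmt->fetchColumn();
}

var_dump(zipExists($pdo, '90210'));
```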

Related

Efficient memory and CPU usage with progressive read and write [closed]

Suppose I fetch many data rows in Laravel and have to write them to a CSV file, and suppose the server's execution time runs out before all the rows are written. How can I proceed so that every row still ends up in the CSV file?
I want to fetch the rows, free memory, and read and write the data to the CSV file progressively, but I have no clue how to do that in Laravel.
Any references or ideas would help.
Create a file and store its location somewhere.
Limit the records fetched per query: get the count of the records you need, then loop, fetching say 100 records at a time and appending each batch to your CSV file (see the sketch below).
Alternative way (simplest): use the Maatwebsite/Laravel-Excel library to create CSV, Excel, etc.
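A minimal sketch of the chunked approach with Laravel's query builder; the orders table and its columns are assumptions for illustration:

```php
<?php
// Minimal sketch: stream a table to CSV in chunks of 100 rows,
// so memory stays flat no matter how many rows the table holds.
use Illuminate\Support\Facades\DB;

$handle = fopen(storage_path('app/orders.csv'), 'w');
fputcsv($handle, ['id', 'customer', 'total']); // header row

DB::table('orders')
    ->orderBy('id') // chunk() needs a stable ordering
    ->chunk(100, function ($rows) use ($handle) {
        // Only 100 rows are hydrated in memory at any moment.
        foreach ($rows as $row) {
            fputcsv($handle, [$row->id, $row->customer, $row->total]);
        }
    });

fclose($handle);
```

If a web request still hits the execution-time limit, running the export as an Artisan command or a queued job sidesteps the web server's timeout entirely.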

Session variables or new sql queries [closed]

I was wondering which is more efficient: assigning 7 session variables when the user logs in, or passing the user ID and making new SQL queries whenever the information is needed. I also want to cater for mobile users with low data allowances, and we don't have free WiFi around here.
PHP is all done server side, so I don't think there are any special considerations for mobile users here beyond the usual ones, such as keeping page downloads small and accessible. In other words, server resources are the bottleneck, so go whichever way you like! Generally speaking, storing values (caching) is faster in terms of processing, but can use more memory than fetching data as you need it.
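For the caching side, a minimal sketch with hypothetical users columns: query the details once at login, store them in the session, and read them from the session on later pages:

```php
<?php
// Minimal sketch of session caching (hypothetical column names).
session_start();

function login(PDO $pdo, int $userId): void
{
    // One query at login caches the fields every page needs.
    $stmt = $pdo->prepare('SELECT name, email, role FROM users WHERE id = ?');
    $stmt->execute([$userId]);
    $_SESSION['user']    = $stmt->fetch(PDO::FETCH_ASSOC);
    $_SESSION['user_id'] = $userId;
}

// On any later page: no query needed.
$role = $_SESSION['user']['role'] ?? 'guest';
```

The trade-off above still applies: the session copy can go stale if the row changes, so refresh the session (or re-query) after anything that updates the user's record.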

Need to handle large database [closed]

I have a table with 9 columns and 400,000 records, using PHP and MySQL for the database. The problem I am facing is that it takes quite a long time to fetch particular data or search the records. Can anyone suggest whether I should use another database, or some tweaks to make in this one, and also suggest the best hosting to handle this many records on my site?
400,000 records is not considered large data. What you need to do is make sure you have proper indexes on the table columns you search, and, most importantly, load only the data that is required, i.e. implement paging.
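A minimal sketch of both suggestions, using a hypothetical products table searched by name:

```php
<?php
// Minimal sketch: an index on the searched column plus paging.
// The products table and its columns are assumptions for illustration.
// One-time schema change (run once, not per request):
//   ALTER TABLE products ADD INDEX idx_products_name (name);

$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');

$page    = max(1, (int) ($_GET['page'] ?? 1));
$perPage = 50;

// Fetch one page of rows instead of all 400,000. A prefix LIKE
// ('widget%', no leading wildcard) can still use the index.
$stmt = $pdo->prepare(
    'SELECT id, name, price FROM products
     WHERE name LIKE ?
     ORDER BY id
     LIMIT ? OFFSET ?'
);
$stmt->bindValue(1, 'widget%');
$stmt->bindValue(2, $perPage, PDO::PARAM_INT);
$stmt->bindValue(3, ($page - 1) * $perPage, PDO::PARAM_INT);
$stmt->execute();
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
```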

Getting whole MySQL table into PHP array VS reading row by row [closed]

What are the drawbacks of reading a table of, say, up to 10,000 rows into a PHP array versus reading and processing it row by row?
Assuming your data doesn't use up all your memory, I'd say there is no point in using arrays. When you send a query to the server and get the result, the entire result set is in memory anyway, just in the libmysql (dll/so) client's memory space instead of PHP's. Fetching rows from there one by one is quite fast, since libmysql is compiled and highly optimized, while PHP is an interpreted language. The difference may not be apparent on small results, but on big ones you will notice.
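A minimal sketch of the two styles with PDO; the events table and handler are illustrative:

```php
<?php
// Minimal sketch comparing whole-array vs row-by-row processing.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Style 1: the whole result copied into a PHP array in one go.
$all = $pdo->query('SELECT * FROM events')->fetchAll(PDO::FETCH_ASSOC);

// Style 2: row by row; only one row is held in PHP memory at a time.
// Turning off buffered queries also streams the result from the server
// instead of buffering it all in the client library first.
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);
$stmt = $pdo->query('SELECT * FROM events', PDO::FETCH_ASSOC);
foreach ($stmt as $row) {
    handleRow($row); // hypothetical per-row handler
}
```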

Updating a MySQL table every couple of seconds with large amounts of data? [closed]

I've created an application that has an autosave feature: instead of the user having to click a save button, their settings are automatically saved with every interaction with the app. Behind the scenes, every time the app's state changes, I POST the data to PHP and update a MySQL database table.
It's working well, but nobody's using it yet, so my question is: if I'm updating a MySQL database table with this save data (which could be the equivalent of a 100 KB XML file) every couple of seconds, could I experience performance issues? It should be noted that there could be hundreds or thousands of users using the app at the same time.
Any tips or advice would be appreciated.
Bundle up and serialize all your data changes into a single JSON object before POSTing it (as a single field). Fewer large interactions will perform better than constant tiny ones.
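A minimal sketch of the receiving side, assuming a hypothetical app_state table with a unique key on user_id; the whole client state arrives as one JSON field and is written with a single upsert:

```php
<?php
// save.php - minimal sketch of the receiving side. The app_state table
// (with a unique key on user_id) is an assumption for illustration.
session_start();
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// The client bundles every change into one JSON field named "state".
$state = $_POST['state'] ?? '';
json_decode($state);
if (json_last_error() !== JSON_ERROR_NONE) {
    http_response_code(400); // reject malformed payloads early
    exit;
}

// One upsert per save instead of many small writes.
$stmt = $pdo->prepare(
    'INSERT INTO app_state (user_id, state) VALUES (?, ?)
     ON DUPLICATE KEY UPDATE state = VALUES(state)'
);
$stmt->execute([$_SESSION['user_id'] ?? 0, $state]);
```

Debouncing on the client (saving after a short pause in activity rather than on every interaction) cuts the write rate further without the user noticing.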
