I just have a general question. I am concerned with the speed of my PHP website, which is set to go into production soon.
My concern is the length of time it takes for the page to run a query.
On my page, I have about 14 filters in an HTML form, and I use the GET method to retrieve all of their values. Not all 14 filters have to be used; a user can search off just one. Of course, the more filters are selected, the longer the query becomes. Counterintuitively, though, the larger query loads faster, presumably because it narrows the result set, so it's actually beneficial for the user to select more filters rather than just one.
All of the filter values are then sent to an included PHP file, which builds a query based on the user's filter selections.
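A minimal sketch of how such an include might assemble the query from optional filters; the table, column, and parameter names here are hypothetical, and $pdo is assumed to be an existing PDO connection. Prepared statements keep the raw GET values out of the SQL string:

    <?php
    // filters.php (hypothetical) - build a query from optional GET filters.
    $conditions = array();
    $params = array();

    if (!empty($_GET['city'])) {
        $conditions[] = 'city = ?';
        $params[] = $_GET['city'];
    }
    if (!empty($_GET['category'])) {
        $conditions[] = 'category = ?';
        $params[] = $_GET['category'];
    }
    // ...repeat for the remaining filters...

    $sql = 'SELECT * FROM records';
    if ($conditions) {
        $sql .= ' WHERE ' . implode(' AND ', $conditions);
    }

    $stmt = $pdo->prepare($sql);  // $pdo: an existing PDO connection
    $stmt->execute($params);
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);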
The query runs and I am able to print the selected data into an HTML table on the original page. The problem is that it can take quite some time for the page to render and finally display the data table.
The database is not too large. Maybe between 20K - 40K records, though there are over 20 columns per record.
When I run the same query in MySQL, it returns the data faster than it does on the page.
Here is where I believe the problem might lie.
Within the form are the filters. About 5-6 of the filters are running queries themselves to populate the selection data for the user.
I believe that after the user runs a query, the page refreshes and it has to re-run all the filter queries within the form.
If this is the case, what steps, if any, can I take to fix it? Should I place all of the filters in a separate file and include them within the form? If not, please advise what I can do to speed up page loading.
I have visited various websites in an attempt to fix this issue, like this one:
http://code.tutsplus.com/tutorials/top-20-mysql-best-practices--net-7855
I am following just about every step suggested by that site, but I am still experiencing the page load delay.
I hope I can receive some positive insight.
Thank you in advance.
If all the filters are static, meaning they don't disappear or change when a value is selected, you can place them outside of the part of the view that reloads.
I'm currently building a site that deals with AJAX-driven reloads and had to handle a very similar problem. My fields sit outside of the reloaded area, and I get very fast load times.
If the filters are dynamic and need to change based on the options chosen, I would reload them separately, basically working out which ones changed versus what simply needs to be displayed.
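To make that concrete, here is a minimal sketch of one way to do the partial reload on the PHP side. Everything here is hypothetical (the include names, the ?ajax=1 convention); the idea is just that the client-side fetch requests only the results table, so the dropdown-populating queries run once, on the initial page load:

    <?php
    // Hypothetical sketch: the same page serves both the full view and an
    // AJAX partial. With ?ajax=1 only the results table is rendered, so the
    // 5-6 dropdown-populating queries run just once, on the initial load.
    $ajaxOnly = isset($_GET['ajax']);

    if (!$ajaxOnly) {
        include 'filter_form.php';  // hypothetical include: runs the dropdown queries
    }

    include 'build_query.php';      // hypothetical include: builds the search query
    include 'render_table.php';     // hypothetical include: prints the HTML results table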
Hopefully this helps and explains well enough.
I am working on a project for a client to make their roster viewing and entry paper-free. Some things rely on user input, but others are varied constants and are pulled from a database. One of the dropdowns pulling its information from said database is loading very slowly in my testing environment - I click it, and it takes two to three seconds for the items to actually show up.
I assume this is because the dropdown contains easily close to 400 items. What I'd like to know is whether there is any way I can optimize this to make the loading faster. Cutting it off at halfway, or even a fourth of the way, and making more dropdowns is outside the project's specification, so a suggestion like that would have to go through the client for the go-ahead. If that's the only solution, though, I'll do that.
Thanks in advance for suggestions.
EDIT:
This is the kind of oversight that makes me laugh, but I was wrong about my conservative estimate of 400 items - the list contains 12,700 items.
For all who assume the SQL query may be slow - I am doing a simple SELECT DISTINCT from one table, with only one WHERE condition.
Before this question can be answered well, you'll need to do more testing and debugging to figure out what exactly is slowing things down.
- Is this just a problem with your machine running something slowly? Test things out on a real webserver and see how things go.
- Is the SELECT from the database slow? If so, that's a problem; SELECTs from MySQL should be really fast.
- Is the PHP slow? If so, you can use an IDE like PhpStorm to see exactly which functions take how long to run.
- Is the rendering in the client browser slow? If so, we can look at optimizing the JavaScript/HTML.
As noted in a comment above, we also need to know when you are getting the data from the database (before the page loads, or via AJAX). So before you can really look at how to speed this up, you need to know what is taking so long; start there. Also make sure to check your JavaScript and HTML for errors: copy/paste the entire page's HTML into the W3C validator at http://validator.w3.org/#validate_by_input and fix all errors and warnings, and check the browser's debugger console for JavaScript errors.
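If the server side turns out to be the suspect, a rough way to see where the time goes is to bracket each step with timers. A minimal sketch, assuming the query and the option-list building happen in one script ($mysqli is an existing connection; the table and column names are made up):

    <?php
    $t0 = microtime(true);
    $result = $mysqli->query('SELECT DISTINCT name FROM roster WHERE active = 1');
    $t1 = microtime(true);

    $options = '';
    while ($row = $result->fetch_assoc()) {
        $options .= '<option>' . htmlspecialchars($row['name']) . '</option>';
    }
    $t2 = microtime(true);

    error_log(sprintf('query: %.3fs, build option list: %.3fs', $t1 - $t0, $t2 - $t1));
    echo '<select>' . $options . '</select>';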
You could use an infinite scroll to break up the amount of data being returned. Take a look at this answer:
infinite scroll select
This may not be the underlying issue though. I would check the performance of your query to ensure that is not where the issue lies.
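For what it's worth, a minimal sketch of the server side of that kind of incremental loading: an endpoint that returns one page of options at a time instead of all 12,700. The table and column names are hypothetical, and $pdo is assumed to be an existing PDO connection:

    <?php
    // options.php (hypothetical) - returns 50 options per request as JSON;
    // the client asks for ?page=0, ?page=1, ... as the user scrolls.
    $perPage = 50;
    $page = isset($_GET['page']) ? max(0, (int)$_GET['page']) : 0;

    $stmt = $pdo->prepare(
        'SELECT DISTINCT name FROM roster WHERE active = 1
         ORDER BY name LIMIT ? OFFSET ?'
    );
    $stmt->bindValue(1, $perPage, PDO::PARAM_INT);
    $stmt->bindValue(2, $page * $perPage, PDO::PARAM_INT);
    $stmt->execute();

    header('Content-Type: application/json');
    echo json_encode($stmt->fetchAll(PDO::FETCH_COLUMN));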
I have a fixed (rarely changing) list of ~100 words. I want to display a random word on my HTML page on every reload. Should I hardcode the words as an array in the PHP script, or should I put them into a MySQL table and pull a random entry from there? What are the possible performance/maintainability considerations here?
It depends. If you ever want to easily manage these words, or have someone else manage them, I would put them into a database. A database has extremely high overhead relative to a PHP array, although the difference is likely unnoticeable to a human if the database is hosted locally.
I would not use anything other than a PHP array, a database table, or a text file, though. Even a text file feels a bit extraneous and shouldn't be used; if you are tempted to put the list in a text file, it is probably better off in a database.
My take is that it depends on these factors:
How rarely is "rarely"? Like once every year? Or maybe once every month?
How many requests are you getting, and how many are you predicting?
Do you have an established development/deployment cycle? Meaning, steps between changing some code and actually updating in production servers.
Do you have direct access to the database? Or would you have to set up an admin tool in order to edit the list?
I would favor a non-MySQL scenario if you don't use the DB for anything else, or if you are getting millions of requests per day, so as not to add millions of queries for such a simple objective. Maybe using a local file with the words would suffice, if you have relatively straightforward access to the filesystem.
I wouldn't go for MySQL, as it's just not needed for a non-changing set of words. If you plan to change them at all, go for a CSV file (using implode() and explode() to manage it); if you will change them rarely or never, a PHP array is best for performance, with zero maintenance.
If you wanted to change the words via a nice interface you were going to write, I'd store them in MySQL. If they rarely change and it's just as easy to update your code as it is the database, then you might as well just store them in a PHP array.
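For scale, the PHP-array version is tiny. A minimal sketch (placeholder words):

    <?php
    // Hardcoded word list; array_rand() returns a random key of the array.
    $words = array('apple', 'breeze', 'cobalt'); // ...plus the rest of the ~100 words
    echo htmlspecialchars($words[array_rand($words)]);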
I would say to hardcode the words as a PHP array instead of using MySQL and all the connection handling that goes with it. It is easier, and you said the list will rarely change.
I have a blog system which stores articles in a database. Now I want to build a feature that displays the five most popular articles, according to how many views each gets.
Is there any sort of technology out there which I can take advantage of that tracks how many views a page has received, and that can be integrated into a database?
Or perhaps there is a better internal method of doing something like this?
Thanks in advance.
EDIT: If you are going to downvote my thread randomly, at least tell me why.
You have three choices as an approach for this:
- collect the usage count inside your database (a click counter)
- extract that information from the HTTP server's access log file later
- implement a click counter based on HTTP server request hits
Each approach has advantages and disadvantages. The first obviously means you have to implement such a counter and modify your database schema. The second means you get asynchronous behavior (not always bad), but the components depend on each other, so your setup becomes more complex. So I would advise the first approach.
A click counter is something really basic and typical for all cms/blog systems and not that complex to implement. Since the content is typically generated dynamically (read: by a script) you typically have one request per view, so it is trivial to increment a counter in a table recording views of pages. From there your feature is clear: read the top five counter values and display a list of five links to those pages.
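A minimal sketch of that counter in PHP with MySQL, assuming a hypothetical article_views table keyed by article id and an existing PDO connection:

    <?php
    // Hypothetical schema: article_views(article_id PRIMARY KEY, views INT).
    // On every article view, increment (or create) that article's counter.
    $stmt = $pdo->prepare(
        'INSERT INTO article_views (article_id, views) VALUES (?, 1)
         ON DUPLICATE KEY UPDATE views = views + 1'
    );
    $stmt->execute(array($articleId)); // $articleId: id of the article being viewed

    // For the "most popular" box: the five highest counters.
    $top = $pdo->query(
        'SELECT a.id, a.title, v.views
         FROM articles a
         JOIN article_views v ON v.article_id = a.id
         ORDER BY v.views DESC
         LIMIT 5'
    )->fetchAll(PDO::FETCH_ASSOC);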
If you go with the second approach, then you will need to store the extracted information somewhere, since log files are rotated, compressed, archived, and deleted. So you either need a self-tailored database for that or some finished product. But as said: this approach is much more complex in the end.
The last option is not something I have seen in practice; it just sprang to my mind. You could, for example, use PHP's auto_append_file feature (just as an example) to run a counting routine in a generic way. That routine could interpret the request URL and decide whether it was a request for an article view; if so, it could increment a click counter, typically in a small database, since you might have several requests at the same time, which speaks against using a file. But why make things that complex? Go with the first option.
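That said, if you did want to experiment with it, a rough sketch of the appended script (the URL scheme and table are hypothetical, and the script would be registered via auto_append_file in php.ini):

    <?php
    // count_hit.php (hypothetical) - runs after every request via auto_append_file.
    if (preg_match('#^/article/(\d+)#', $_SERVER['REQUEST_URI'], $m)) {
        // Placeholder credentials; open (or reuse) a connection here.
        $pdo = new PDO('mysql:host=localhost;dbname=blog', 'user', 'pass');
        $stmt = $pdo->prepare('UPDATE article_views SET views = views + 1 WHERE article_id = ?');
        $stmt->execute(array($m[1]));
    }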
I'm working on a POST register system. Simply put, every single POST request sent to the site must be recorded, along with everything the user posted (in other words, the $_POST array).
I see 2 ways of doing this:
The right way - having a separate table registerPostInfo for post information where every single element of the array will be inserted as a new record.
The wrong way - creating an additional column to my registerPost table which would hold the json_encode()'d post array.
I'm asking for advice because even though it may be considered 'wrong', I honestly think I'll be better off with the second solution, because this table gets flooded like crazy. I made 2,000 records all by myself in a one-month testing period on a local server. If I were to proceed with the first solution, with an average of five elements per POST array, that would already be 10,000 records in the registerPostInfo table; imagine that with thousands of people. I'd be happy for any useful information about my issue, and possibly a third way I haven't thought of.
Thanks in advance!
Depends on what the actual purpose of “recording” all the posted data is. If you just want this as a kind of log, so that you can later reconstruct what a user was posting should it turn out to be malicious or unwanted, then I’d say storing it as JSON or serialized into a single column is totally OK. Whereas if you want to process that data in general at some point, maybe even search it for certain parameter names/values, then you might be better off storing it as single parameter_name|value records, all tied together by an id for each single POST request made.
So if the main purpose is not actually working with that data constantly, but only to “analyze” it when necessary, I’d go with serialized data. Easy enough to select it from the database by time of posting or by user id – and then the de-serializing part could be done by a script. And your secondary use, showing to the user what kind of content they have created – well that you should be able to get from the tables that actually hold that content.
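A minimal sketch of the serialized approach, assuming a hypothetical post_data column added to the existing registerPost table and an existing PDO connection:

    <?php
    // One row per POST request; the whole $_POST array serialized
    // into a single (hypothetical) post_data TEXT column.
    $stmt = $pdo->prepare(
        'INSERT INTO registerPost (user_id, posted_at, post_data)
         VALUES (?, NOW(), ?)'
    );
    $stmt->execute(array($userId, json_encode($_POST))); // $userId: current user's id

    // Reconstructing a request later, when analysis is actually needed:
    // $post = json_decode($row['post_data'], true);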
I'm having to build a directory type website that will be fairly feature intensive on the search side of things.
It will have a lot of doctors in it (doctors' names, addresses, biographies, specialties, etc.). All of this will be stored in a database. There will be a few preferred member doctors who, when showing up as a search result, will appear at the top of the results, above all others.
I need the user to be able to search by entering their query into ONE field, with the option of adding their zip code to a second field next to the search box (not required). If they add their zip code, the search will bring up the doctors closest to them first (like a Google local search). If they don't enter their zip code, can I geo-target them and still use the same feature to show the doctors closest to them first?
I'm entering all of the doctors information into a database table, because I don't want to have to manually create a page for each doctor that would take forever as there are over 4,000 doctors. I'm going to just create a page they can each log onto to type their info into fields and hit submit and have that place it in the tables for me. (this I'm fine with, I can handle).
Can anyone suggest a search engine or search tool/program I can use that I can tweak to get this all to work?
I've used Sphider on several sites and really like it, but I can't imagine how I could use it and get all of these features. I know this is a very ambitious project, so any help anywhere is invaluable.
Thank you all! Wish me luck!
One thing I don't quite understand: as you said, the doctors' data is stored in a table in your database, so why do you need an extra, third-party tool to search it? You can easily write a search script for your own table.
Also, as I understood it, your table holds about 4,000 records. I have an application with a table of 6,236 records (the verses of the Quran), and I manage my own search through it easily.
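To illustrate, a minimal sketch of such a hand-rolled search, assuming a MySQL FULLTEXT index on the searched columns, a hypothetical is_preferred flag for the preferred members, and an existing PDO connection:

    <?php
    // $query: the user's single search field, e.g. "cardiologist smith".
    // A FULLTEXT index on (name, biography, specialty) is assumed;
    // is_preferred pushes the preferred members to the top.
    $stmt = $pdo->prepare(
        'SELECT name, address, specialty,
                MATCH (name, biography, specialty) AGAINST (?) AS relevance
         FROM doctors
         WHERE MATCH (name, biography, specialty) AGAINST (?)
         ORDER BY is_preferred DESC, relevance DESC'
    );
    $stmt->execute(array($query, $query));
    $results = $stmt->fetchAll(PDO::FETCH_ASSOC);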