MySQL security queries in PHP [closed]

Sorry if the title doesn't quite match the content.
I've written some code that runs a query against the DB, selects some rows, and then loops over the result to build an array:
$result = mysql_query("SELECT ----");    // query text elided in the original
$count  = mysql_num_rows($result);
for ($i = 0; $i < $count; $i++) {
    $info[] = mysql_fetch_assoc($result);
}
So the query itself is simple and fast.
These queries run every time a visitor opens a certain page on the site.
It all works great, but I never want to trust my visitors.
What if someone reloads the page many times (with multiple browsers, bots, and so on)?
Then my queries per second would climb sharply. Could that crash the server?
So what I'm wondering is: how can I serve the content while preventing a user from refreshing the page too many times a second? A cookie that records the last DB connect? Any ideas or advice?

It normally works this way:
define('THIRTY_SECONDS', 30);

if (current_request_cache_time_in_seconds() < THIRTY_SECONDS) {
    get_request_response_from_cache();
} else {
    get_request_response_from_php_code();
    save_request_response_to_cache();
}
As you can see, this is high-level pseudocode: it describes the control flow, and you decide what code actually runs inside it. In this example, the real code for a given page executes at most once per 30 seconds (unless a race condition appears).
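A minimal file-based sketch of that pseudocode, assuming the request URI is a good enough cache key; APCu, Memcached, or a reverse proxy would do the same job more robustly:

define('THIRTY_SECONDS', 30);
$cacheFile = sys_get_temp_dir() . '/page_' . md5($_SERVER['REQUEST_URI']);

if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < THIRTY_SECONDS) {
    // Cache hit: serve the stored response without touching MySQL.
    echo file_get_contents($cacheFile);
} else {
    // Cache miss: run the real page code, capture its output, and store it.
    ob_start();
    // ... run the SELECT and render the page here ...
    $response = ob_get_flush();              // sends the output AND returns it
    file_put_contents($cacheFile, $response, LOCK_EX);
}

With this in place it no longer matters how often a visitor hammers refresh: at most one request per 30 seconds reaches the database.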

Related

Laravel best practice for saving a user activity log in the database (considering database performance) [closed]

I have a website (Laravel + MySQL on a dedicated server) where I log every page each user views, for reporting.
The site gets 10,000 visits a day, and after a few months this logging has made the database much bigger: the 'visits' table now occupies 85% of the whole database!
What is the best way to handle this?
I have not encountered this problem before, but I think it's better to take logging with this much write load out of the primary database; you can move it to the file system or to a logging service (read this).
Alternatively, you can run a job (background process) to remove the logs you no longer need, such as anything older than a month; that will help the database a little. A sketch of such a job follows below.
Read some best practices
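A sketch of that cleanup job using Laravel's scheduler; the 'visits' table name and 'created_at' column are assumptions about the schema:

// app/Console/Kernel.php -- prune visit logs older than one month, once a day
use Illuminate\Support\Facades\DB;

protected function schedule(Schedule $schedule)
{
    $schedule->call(function () {
        DB::table('visits')                          // assumed table name
            ->where('created_at', '<', now()->subMonth())
            ->delete();
    })->daily();
}

For very large tables, deleting in chunks (or partitioning by month and dropping old partitions) avoids long locks during the purge.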

Is it better to store configuration in a file or database? [closed]

I'm currently working on a project that uses MySQL for configuration, but now I'm starting to think it could slow down page loads.
So my question is, would it be better to store configuration options (that are read almost every page load) inside an XML/JSON file, or a MySQL database?
Thanks.
One thing to consider is how much config data there is, and how often it is likely to change. If the amount of data is small, then storing it in a database (if you're not already using one for anything else) would be overkill; equally, maintaining a database for something that changes once every six months would probably be a waste of resources.
I think this depends on your project. If you want someone else to configure the application through a UI, put the configuration in the database.
If it's just you and some developers, and changes are not made frequently, put it in a file, as sketched below.
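A minimal sketch of the file approach, assuming a config.json next to the script; the file is parsed once per request, and the OS file cache makes repeated reads cheap:

function config(string $key, $default = null)
{
    static $config = null;                           // parsed once per request
    if ($config === null) {
        $config = json_decode(
            file_get_contents(__DIR__ . '/config.json'),
            true
        );
    }
    return $config[$key] ?? $default;
}

// Usage: $title = config('site_title', 'My Site');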

Faster/better: fetch a few JSON objects with PHP or JavaScript? [closed]

Personally I would do this with PHP, but maybe someone has a pro or con for doing it with JavaScript:
I (will) fetch a few song details (about ten per page load) from SoundCloud,
and afterwards the statistics (played this week/total, liked this week/total, etc.) will be displayed in a table.
That's it; nothing special.
I can solve it like this:
let PHP do the work and render the whole page at once.
This will of course take some time before the page is rendered.
Is there any advantage to doing this with JavaScript? I can only think of one:
the page renders faster (with the disadvantage that the results may not be displayed instantly).
I'd say that doing it via JavaScript will actually take longer, as you'll have to connect again. Although, it may give an illusion of "being faster" (though I doubt it'll be noticeable).
I suppose it might depend on your use case too.
How are you handling authentication with SoundCloud?
I would honestly go the jQuery route.
Use a $.post() call and, in your success callback, loop through the JSON and append it to a table. That way you can do things like pagination, updating the table without refreshing the page.
Do it server-side and pre-render the page; a sketch follows below. The SoundCloud API should not be the bottleneck here, so saving a request or two is worth it.
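A sketch of the server-side approach; the endpoint shape, client_id parameter, and field names are assumptions about SoundCloud's API rather than verified details:

$trackIds = [13158665, 13158666];                    // hypothetical track ids
$tracks   = [];

foreach ($trackIds as $id) {
    $json = @file_get_contents(
        "https://api.soundcloud.com/tracks/{$id}?client_id=YOUR_CLIENT_ID"
    );
    if ($json !== false) {
        $tracks[] = json_decode($json, true);        // e.g. playback_count, favoritings_count
    }
}
// ... render $tracks into the table before sending the page ...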

Best way of creating a record cache (Memcache in this example) [closed]

I want to use a caching mechanism to improve performance in a PHP application, and Memcache looks like a good option. Looking into it, though, I found a more general problem with caching.
The problem I see is that if, as in the examples, I simply use the SQL statement as a hashed key for the cached results, duplicate records become possible. This would occur if SQL A fetched records 1, 2, and 4 while SQL B fetched records 2, 3, and 6: record 2 would end up in both cached result sets. And if instead I retrieve each record individually and link to a single shared copy, I'm not really getting any performance increase and would probably slow the application down.
Someone else must have come across this, so I wonder what anyone would suggest as a better way of managing the cache. Or is it simpler not to cache at all and always rely on the database?
Thanks.
Why even worry about this? The point is that the query is cached. If the results of two different queries partially overlap, so be it: unless those result sets are many megabytes in size and take up a lot of space, it doesn't matter. What does matter is that when you want to run the same query again, you don't need to, because the results are already cached. A sketch of this query-keyed approach follows.
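A sketch of query-keyed caching with PHP's Memcached extension; the PDO connection and the 30-second TTL are illustrative choices:

$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

function cached_query(Memcached $mc, PDO $pdo, string $sql, int $ttl = 30): array
{
    $key  = 'sql_' . md5($sql);                      // the SQL text is the cache key
    $rows = $mc->get($key);
    if ($mc->getResultCode() === Memcached::RES_NOTFOUND) {
        $rows = $pdo->query($sql)->fetchAll(PDO::FETCH_ASSOC);
        $mc->set($key, $rows, $ttl);
    }
    return $rows;
}

Overlapping rows across keys simply cost a little memory; correctness is unaffected as long as entries expire (or are invalidated) when the underlying data changes.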

Is this the quickest way to execute this in PHP? [closed]

When a visitor opens index.php, a short lookup runs. There will be only one match in the DB, and the code reads its role to include the correct page.
If nothing is found, the visitor gets a 404 page. I tested it and it works, and I had performance in mind: in my opinion, fetching the row directly is better than first checking if mysql_num_rows > 0 with an if { } else { }.
Those of you with more experience than me: how does this look?
I gather the code is already functioning as intended?
Contacting the database to get the user's role will take significantly longer than almost any PHP code you might write to analyze the result you get back, so I wouldn't worry much about the efficiency of the rest of your PHP here.
Only when you're doing a long-running calculation, or writing a loop that runs many, many times, should you worry about efficiency more than clarity.
If this code runs 50,000+ times a second, then maybe you're right to ask about efficiency; but at a few hundred thousand requests a day or less, your server will never feel the difference, and your time as a coder is far too valuable to spend on this kind of micro-optimization. For reference, a cleaned-up sketch of the pattern follows.
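A minimal sketch of the fetch-or-404 pattern the question describes, using mysqli rather than the removed mysql_* API; the table, column, and page names are assumptions:

$stmt = $mysqli->prepare('SELECT role FROM users WHERE id = ? LIMIT 1');
$stmt->bind_param('i', $userId);
$stmt->execute();
$row = $stmt->get_result()->fetch_assoc();           // null when nothing matched

if ($row === null) {
    http_response_code(404);
    include __DIR__ . '/404.php';
    exit;
}

// Whitelist the role before using it to pick a file to include.
$pages = ['admin' => 'admin.php', 'member' => 'member.php'];
include __DIR__ . '/' . ($pages[$row['role']] ?? '404.php');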
