How to process strings with embedded variables in MySQL and PHP

I am in need of some advice on how to best process code to keep things running efficiently while still returning dynamic strings to the user.
In my particular situation, I have a set of string responses that vary based on a set of six variables. Early in the project, it was easy to simply use nested if and switch statements. However, as the project advances, this set of code is becoming massive. I would very much like to find a way to store this information in a database, but I am unaware of a way to do so that doesn't open up the database to vulnerabilities.
A sample script might look like the following:
printf("Let's say for a moment here, that I want to have a script that is something like this where, ". $yourname ." is printed as well as other pieces of information stored from a database such as: ".$yourfavoritefood." or ".$timesinceloggedon.". Clearly this output will be different from person to person.");
Now imagine I have hundreds of these variable driven scripts. Is there a way to SECURELY store scripts in a mysql database so that I can query only the script I need given the program driven variables at the time?

You can store the text you create in the database or in a file and use printf with arguments to output the text: http://php.net/manual/en/function.printf.php
printf("Let's say for a moment here, that I want to have a script that is something like this where, %s is printed as well as other pieces of information stored from a database such as: %s or %s. Clearly this output will be different from person to person.", $yourname, $yourfavoritefood, $timesinceloggedon);
By taking all the parts that differ between users into parameters, you end up with a common string that can be stored wherever you store translations.
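For instance, the stored text can use ordinary printf placeholders (%s), so the template fetched from the database is plain data rather than code. A minimal sketch (the table name mentioned in the comment is hypothetical):

```php
<?php
// The template row would come from a table such as `message_templates`
// (hypothetical name); placeholders use printf syntax, so the stored
// string is plain data and is never executed as code.
$template = "Hello %s! Your favorite food is %s and you last logged on %s ago.";

// The values come from your application, not from the stored template.
$args = ['Alice', 'pizza', '2 days'];

$message = vsprintf($template, $args); // fills the %s placeholders in order
echo $message;
```

Because the template contains no executable code, storing it in MySQL is no riskier than storing any other string; the usual prepared-statement rules for reading and writing it still apply.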


PHP code-generation

Instead of eval() I am investigating the pros and cons of creating .php files on the fly using PHP code.
Mainly because the generated code should be available to other visitors and for a long period of time, not only for the current session. The generated .php files are created by functions dedicated to that and only that, under highly controlled conditions (no user input will ever reach those code files).
So, performance-wise, how much load is put on the web server when creating .php files for later execution via include(), compared to updating a database record and querying the database on every visit?
The generated files will be updated (overwritten) quite frequently, but far less frequently than they will be executed.
What are the other pros/cons? Does the possibility of one user overwriting the code files while others are executing them call for complicated concurrency handling? Using a mutex? Is it next to impossible to overwrite the files if visitors are constantly "viewing" (executing) them?
PS. I am not interested in alternative methods/solutions for reaching "the same" goal, such as:
Cached and/or saved output buffers: out of the question, mainly because the output from the generated PHP code is highly dynamic and context-sensitive.
Storing the code as variables in a database and creating dynamic PHP code that does what is requested based on the stored data: mainly because I don't want to use a database as the backend for this feature. I never need to search the data, or query it for aggregation, ranking, or any other data collection or manipulation.
Memcached, APC, et cetera: it's not a caching feature I want.
A stand-alone (non-PHP) server with a custom compiled binary running in memory: not what I am looking for here, although this alternative has crossed my mind.
EDIT:
Got many questions about what "type" of code is generated. Without getting into details, I can say: it's very context-sensitive code. The code is not based on direct user input, but on input in terms of choices, positions, and flags, like "closed" objects in relation to other objects. Most code parts are related to each other in many different, but very controlled, ways (similar to linked lists, genetic cells in AI code, et cetera), so querying a database is out of the question. One code file will include one or more others, and so on.
I do the same thing in an application. It generates static PHP code from data in a MySQL database. I store the code in memcached and use eval() to execute it. Only when something changes in the MySQL database do I regenerate the PHP. It saves an awful lot of MySQL reads.
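A minimal sketch of that cache-then-eval pattern, with a plain array standing in for memcached and a made-up generate_code() standing in for the real MySQL-driven generator:

```php
<?php
// generate_code() stands in for the real generator that builds PHP
// source from MySQL rows; the array $cache stands in for memcached.
function generate_code(string $key): string {
    // In the real application this string is assembled from database data;
    // no user input may ever reach it, or eval() becomes dangerous.
    return 'return strtoupper("' . $key . '");';
}

function run_generated(array &$cache, string $key) {
    if (!isset($cache[$key])) {
        $cache[$key] = generate_code($key); // regenerate only on a cache miss
    }
    return eval($cache[$key]);             // execute the cached source
}

$cache = [];
echo run_generated($cache, 'widget');      // compiles once, then serves from cache
```

After the first call the generator (and hence MySQL) is never touched again until the cache entry is invalidated, which is where the savings come from.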

Is storing permanent data in a PHP file to reduce the number of SQL queries a good practice?

I'm creating a new website. In the sites I've coded before, I have always used a PHP file, pulled in with an include from index.php, to store permanent data that is used a lot.
The purpose of that is to avoid doing extra queries, as this data is used every time and doesn't change, but now I have concerns about whether that is a good practice.
One example is the set of language options the website offers, which is shown every time, and I don't want to do an extra query for it on every request.
Is that a good practice, or is it better to store everything in the DB and make more queries?
I store my language dependent strings in separate files, grouped by function, for example email. So I would have a template called
lng_email
Then I add a suffix to it in the include statement. For English the example would be
lng_email_en.php
The suffix is fetched at login and stored in the session variable.
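One way to keep that include safe is to check the session suffix against a whitelist before building the filename. The helper below is a sketch; the list of language codes is an assumption:

```php
<?php
// The whitelist of suffixes is an assumption; adjust it to the
// languages you actually ship files for.
function language_file(string $base, ?string $lang, array $allowed = ['en', 'de', 'fr']): string {
    // Fall back to English unless the suffix is whitelisted, so a tampered
    // session value can never turn into an arbitrary include path.
    if (!in_array($lang, $allowed, true)) {
        $lang = 'en';
    }
    return $base . '_' . $lang . '.php';
}

// Usage at request time (commented because it needs a session and the files):
// include __DIR__ . '/' . language_file('lng_email', $_SESSION['lang'] ?? null);
```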
Well, 3 years after I posted the question, I ended up storing the data the following way:
Almost all the data is stored in the MySQL database, where it's easy to manage.
I use the $_SESSION var only for the user's data that is called and used during that user's session.
I have a PHP file where I store general data that is used a lot by the code and won't change in the short or mid term. This way I save the code from doing several queries to the database. I store the values inside a function so they can be retrieved by other functions without being declared as globals. As an example:
function retrieve_privacy_list() {
    $privacy_list = array(
        1 => 'public',
        2 => 'reduced',
        3 => 'private');
    return $privacy_list;
}

What is the most effective way to store array from PHP to JS/jQuery so it is re-useable

I've run into a problem I had never thought about. I'm making a site where the user can practice his lexical knowledge of a particular language. For this reason I have a form where a foreign word is loaded and a translation input is expected. Each lesson consists of 20 words.
The problem: I need to get all those words only once and somehow store them somewhere so my code can use this array every time the user moves on to the next word. I want to avoid connecting to the database 20 times, once for each word.
At this point I receive the array through an AJAX function in JSON format.
So far I have read of the following solutions (each with its own pros and cons).
1. use JS local storage
2. store the JSON in a hidden field
3. use a global JS variable.
What other options do I have, and which is the most suitable?
I would say your best bet would be to use JSON. Query your database once and load your results into a JSON object. From there you can use the data whenever you need it. This would be more effective than using JS variables, and also should have more support cross-browser.
Copter labs has a pretty good overview of how to use JSON:
http://www.copterlabs.com/blog/json-what-it-is-how-it-works-how-to-use-it/
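On the PHP side, such an endpoint can simply encode the whole lesson once with json_encode. In this sketch the query is commented out and the rows are inlined so it is self-contained; lesson_words and its columns are invented names:

```php
<?php
// One endpoint returns the whole 20-word lesson as a single JSON payload,
// so the browser talks to the database only once per lesson.
// $words = $pdo->query('SELECT foreign_word, translation FROM lesson_words LIMIT 20')
//              ->fetchAll(PDO::FETCH_ASSOC);
$words = [
    ['foreign_word' => 'Haus',  'translation' => 'house'],
    ['foreign_word' => 'Apfel', 'translation' => 'apple'],
];

// The JS side keeps the decoded array and steps through it word by word;
// no further requests are needed until the next lesson.
$json = json_encode($words);
echo $json;
```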

Which makes the script slower? (I have two choices)

I have a table in the database named ads; this table contains data about each ad.
I want to get that data from the table to display the ads.
Now, I have two choices:
Either get all the data from the table and store it in an array, and then work with this array to display each ad in its position using loops.
Or access the table directly and get each ad's data to display it; note that this way will use more queries to the database.
Which one is the best way and won't make the script slower?
In most cases #1 is better.
Because, if you can select the data (the smallest needed set) in one query, then you have fewer roundtrips to the database server.
Accessing array or object properties (from memory) is usually faster than DB queries.
You could also consider preparing your data and not mixing fetching with view output.
The second option, "select on demand", could make sense if you need to "lazy load", maybe because you can or want to recognize client properties, like the viewport.
I'd like to highlight the following part:
get all data from table and store it in array
You do not need to store all rows in an array. You could instead take an iterator that represents the result set and use that.
Depending on the database object you use, this is often the less memory-intensive variant. Also, you would run only one query here, which is preferable.
The iterator is actually common with modern database result objects.
Additionally, this is helpful to decouple the view code from the actual database interaction, and you can also defer the SQL query.
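As a sketch of the iterator approach: PDOStatement is itself traversable, so you can loop over the result set without calling fetchAll() first. An in-memory SQLite table stands in for the real ads table here:

```php
<?php
// In-memory SQLite as a stand-in for the real database and `ads` table.
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE ads (id INTEGER PRIMARY KEY, title TEXT)');
$pdo->exec("INSERT INTO ads (title) VALUES ('First ad'), ('Second ad')");

// One query; the statement object is the iterator over the result set.
$stmt = $pdo->query('SELECT id, title FROM ads ORDER BY id');

$titles = [];
foreach ($stmt as $row) {       // no fetchAll(): rows are streamed one at a time
    $titles[] = $row['title'];
}
print_r($titles);
```

The view code only sees something iterable, so it does not care whether the rows come from a live result set or a prebuilt array.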
You should minimize the amount of queries but you should also try to minimize the amount of data you actually get from the database.
So: Get only those ads that you are actually displaying. You could for example use columnPK IN (1, 2, 3, 4) to get those ads.
A notable exception: If your application is centered around "ads" and you need them pretty much everywhere, and/or they don't consume much memory, and/or there aren't too many ads, it might be better performance-wise to store all (or a subset) of your ads in an array.
Above all: Measure, measure, measure!
It is very, very hard to predict which algorithm will be most efficient. Often you implement something "because it will be more efficient" only to find out later that your optimization is actually slowing down your application.
You should always try to run a PHP script with the least number of database queries possible. Whenever you query the database, a request must be sent to the database (usually) over the network, and your script will idle until the response comes back.
You should, however, make sure not to request any more data from the database than necessary. So try to filter as much in the WHERE clause as possible instead of requesting the whole table and then picking individual rows on the PHP layer.
We could help with writing that SQL query when you tell us how your table looks and how you want to select which ads to display.
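As a sketch of that kind of filtering, a prepared IN (...) list can be built from placeholders so only the displayed ads are fetched. In-memory SQLite and invented column values stand in for the real schema:

```php
<?php
// In-memory SQLite as a stand-in for the real database and `ads` table.
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE ads (id INTEGER PRIMARY KEY, title TEXT)');
$pdo->exec("INSERT INTO ads (id, title) VALUES (1,'a'), (2,'b'), (3,'c'), (4,'d')");

$wanted = [2, 4];                                         // ads shown on this page

// One '?' per wanted id, bound through a prepared statement.
$placeholders = implode(',', array_fill(0, count($wanted), '?'));
$stmt = $pdo->prepare("SELECT id, title FROM ads WHERE id IN ($placeholders) ORDER BY id");
$stmt->execute($wanted);

$ads = $stmt->fetchAll(PDO::FETCH_ASSOC);                 // only the rows we display
```

This keeps it to a single roundtrip while still filtering in the WHERE clause instead of in PHP.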

AJAX-like Interaction With Stored Procedure?

I think I'm probably looking at this the complete wrong way. I have a stored procedure that returns a (potentially large, but usually not) result set. That set gets put into a table on the web via PHP. I'm going to implement some AJAX for stuff like dynamic reordering and things. The stored procedure takes one to two seconds to run, so it would be nice if I could store that final table somewhere that I can access it faster once it's been run. More specifically, the SP is a search function; so I want the user to be able to do the search, but then run an ORDER BY on the returned data without having to redo the whole search to get that data again.
What comes to mind is if there is a way to get results from the stored procedure without it terminating, so I can use a temp table. I know I could use a permanent table, but then I'd run into trouble if two people were trying to use it at the same time.
A short and simple answer to the question 'is there a way to get results from the stored procedure without it terminating?': No, there isn't. How else would the SP return the result set?
2 seconds does sound like an awfully long time; perhaps you could post the SP code, so we can look at ways to speed up the queries you use. It might also prove useful to give some more info on your tables (indexes, primary keys...).
If all else fails, you might consider looking into JavaScript table sorters... but again: some code might help here
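Failing that, one common workaround is to cache the stored procedure's rows once (for example in the session) and re-sort them in PHP, so changing the ORDER BY does not re-run the search. The helper below is a sketch; the column names and session key are invented:

```php
<?php
// Re-sort already-fetched rows by one column, ascending or descending,
// without touching the database again.
function sorted_results(array $rows, string $column, bool $desc = false): array {
    usort($rows, function ($a, $b) use ($column, $desc) {
        $cmp = $a[$column] <=> $b[$column];
        return $desc ? -$cmp : $cmp;
    });
    return $rows;
}

// Cache the expensive result once, then every reorder is memory-only:
// $_SESSION['search_results'] = $rowsFromStoredProcedure;
// $sorted = sorted_results($_SESSION['search_results'], 'price', true);
```

This sidesteps the shared-table problem entirely, since each user's cached result set lives in their own session.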
