I'm experiencing a strange problem. I'm caching the output of a query using memcache functions in a file named count.php. This file is called via AJAX every second while a user is viewing a particular page. The output is cached for 5 seconds, so if there are 5 hits to this file within that time I expect the cached result to be returned at least 3-4 times. However, this is not happening: every call sends a query to the DB, as evidenced by an echo statement. Yet if the file is called from the browser directly by typing the URL (like http://example.com/help/count.php) repeatedly many times within 5 seconds, data is returned from the cache (again evidenced by the echo statement). Below is the relevant code of count.php:
mysql_connect(c_dbhost, c_dbuname, c_dbpsw) or die(mysql_error());
mysql_select_db(c_dbname) or die("Could Not Find Database");

$product_id = $_POST['product_id'];
echo func_total_bids_count($product_id);

function func_total_bids_count($product_id)
{
    $qry = "select count(*) as bid_count from tbl_userbid where userbid_auction_id=" . $product_id;
    $row_count = func_row_count_only($qry);
    return $row_count["bid_count"];
}

function func_row_count_only($qry)
{
    if ($_SERVER["HTTP_HOST"] != "localhost")
    {
        $o_cache = new Memcache;
        $o_cache->connect('localhost', 11211) or die("Could not connect to memcache");
        //$key="total_bids" . md5($product_id);
        $key = "KEY" . md5($qry);
        $result = $o_cache->get($key);
        if (!$result)
        {
            $qry_result = mysql_query($qry);
            while ($row = mysql_fetch_array($qry_result))
            {
                $row_count = $row;
                $result = $row;
                $o_cache->set($key, $result, 0, 5);
            }
            echo "From DB <br/>";
        }
        else
        {
            echo "From Cache <br/>";
        }
        $o_cache->close();
        return $row_count;
    }
}
I'm confused as to why the DB is hit every second when AJAX calls this file, but cached data is returned when the URL is typed into the browser. To try the URL method I just replaced $product_id with a valid number (e.g. $product_id = 426 in my case). I don't understand what's wrong here, as I expect data to be returned from the cache within 5 seconds of the first hit. I want the data to be returned from cache. Can someone please help me understand what's happening?
If you're using the address bar, you're doing a GET, but your code is looking for $_POST['...'], so you will end up with an invalid query. So for a start, the results using the address bar won't be what you're expecting. Is your Ajax call actually doing a POST?
Please also note that you've got a SQL injection vulnerability there. Make sure $product_id is an integer.
There are several problems with your code. First of all, you always connect to the database and select a database, even if you don't need to. Second, you should check $result with !empty($result), which is more reliable than just !$result because it also covers empty objects.
As noted above, if 'product_id' is not in the $_POST array, you could use $_REQUEST to also cover $_GET (but you shouldn't if you are certain it's coming via $_POST).
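For what it's worth, a minimal sketch of reading the parameter safely (illustrative only, not the asker's exact code):

// Accept product_id from either GET or POST and force it to an integer,
// which also closes the SQL injection hole mentioned above.
$product_id = isset($_REQUEST['product_id']) ? (int)$_REQUEST['product_id'] : 0;
if ($product_id <= 0) {
    die('Invalid product_id');
}
$qry = "select count(*) as bid_count from tbl_userbid where userbid_auction_id=" . $product_id;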
I have a file whose job is to import data into a SQL database from an API. A problem I encountered is that the API can only return a maximum of 1000 records per call, even though I sometimes need to retrieve large amounts of data, ranging from 10 to 200,000 records. My first thought was to create a while loop that makes calls to the API until all of the data is properly retrieved, and afterwards enter it into the database.
$moreDataToImport = true;
$lastId = null;
$query = '';
while ($moreDataToImport) {
    $result = json_decode(callToApi($lastId));
    $query .= formatResult($result);
    $moreDataToImport = !empty($result['dataNotExported']);
    $lastId = getLastId($result['customers']);
}
mysqli_multi_query($con, $query);
The issue I encountered with this is that I was quickly reaching memory limits. The easy solution is to simply increase the memory limit until it suffices. How much memory I need, however, is unknown, because there is always the possibility of importing a very large dataset, so I can theoretically always run out of memory. I don't want to set an unlimited memory limit, as the problems with that are unimaginable.
My second solution was, instead of looping through the imported data, to send each batch to the database and then do a page refresh with a GET request specifying the last ID I left off on.
if (isset($_GET['lastId']))
    $lastId = $_GET['lastId'];
else
    $lastId = null;

$result = json_decode(callToApi($lastId));
$query = formatResult($result);
mysqli_multi_query($con, $query);
if (!empty($result['dataNotExported'])) {
    header('Location: ./page.php?lastId='.getLastId($result['customers']));
}
This solution solves my memory limit issue, but now I have another issue: after 20 redirects (the exact number depends on the browser), browsers will automatically kill the program to stop a potential redirect loop, then shortly refresh the page. The workaround would be to stop the program yourself at the 20th redirect and allow it to do a page refresh, continuing the process.
if (isset($_GET['redirects'])) {
    $redirects = $_GET['redirects'];
    if ($redirects == '20') {
        if ($lastId == null) {
            header("Location: ./page.php?redirects=2");
        }
        else {
            header("Location: ./page.php?lastId=$lastId&redirects=2");
        }
        exit;
    }
}
else {
    $redirects = '1';
}
Though this solves my issues, I'm afraid it is more impractical than other solutions, as there must be a better way to do this. Are this workaround and the possibility of running out of memory really my only two choices? And if so, is one more efficient/orthodox than the other?
Do the insert query inside the loop that fetches each page from the API, rather than concatenating all the queries.
$moreDataToImport = true;
$lastId = null;
while ($moreDataToImport) {
    $result = json_decode(callToApi($lastId));
    $query = formatResult($result);
    mysqli_query($con, $query); // insert this page before fetching the next
    $moreDataToImport = !empty($result['dataNotExported']);
    $lastId = getLastId($result['customers']);
}
Page your work. Break it up into smaller chunks that will be below your memory limit.
If the API only returns 1000 at a time, then only process 1000 at a time in a loop. In each iteration of the loop you'll query the API, process the data, and store it. Then, on the next iteration, you'll be using the same variables so your memory won't skyrocket.
A couple things to consider:
If this becomes a long running script, you may hit the default script running time limit - so you'll have to extend that with set_time_limit().
Some browsers will consider scripts that run too long to be timed out and will show the appropriate error message.
For processing upwards of 200,000 pieces of data from an API, I think the best solution is to not make this work dependent on a page load. If possible, I'd put this in a cron job to be run by the server on a regular schedule.
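A hypothetical crontab entry (the paths here are illustrative) that runs the import hourly might look like:

# Run the import script at the top of every hour, logging output
0 * * * * php /path/to/import-script.php >> /var/log/import.log 2>&1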
If the dataset is dependent on the request (for example, if you're processing temperatures from one of thousands of weather stations, with the specific station ID set by the user), then consider creating a secondary script that does the work. Calling and forking the secondary script from your primary script will enable your primary script to finish execution while the secondary script executes in the background on your server. Something like:
exec('php path/to/secondary-script.php > /dev/null &');
I have a MySQL stored procedure that updates data across a set of tables (basically for one record in the principal table and related records in a set of child tables). It's called via AJAX through a PHP function. (That is, the AJAX call is to a PHP page, which ultimately calls this SP.) It was working fine, but now I'm trying to make it do one more thing and running into the "Commands out of sync; you can't run this command now" error.
The change is to store one more item in the principal table, but to do so may require adding an item to a child table (called ActionTopic). The page lets the user either choose from a dropdown or type in a new value. I've added two parameters to the SP: one is the PK chosen in the dropdown, the other is the new value typed in. In the SP, I've added the code below. It checks whether a new value was typed in; if so, it calls another SP that checks whether the value is already in the table and, if not, adds it. (I've tried checking and adding the record inline rather than in a separate SP, and I have the same problem.)
if cNewTopic <> '' then
    -- First, make sure the new topic isn't already there
    call aoctest.AddActionTopic(cNewTopic);
    -- SELECT @iTopicID := iID FROM ActionTopic WHERE UPPER(Description) = UPPER(cNewTopic);
    SET @iTopicID = LAST_INSERT_ID();
else
    SET @iTopicID = Topic;
end if;
The page works if the user makes a choice from the dropdown. The problem only occurs when the user types in a new value. Even when I get the error, everything else works as expected. The new value is added to the child table, and the parent table and other children are updated as expected.
Interestingly, if I call the SP in MySQL Workbench with the same parameters (after ensuring that the new value isn't in the new table), it runs without error. The only odd thing I've noticed is that I get two rows in the Output section of MySQL Workbench rather than one. Both show the call to the SP. The first shows "1 row(s) returned" and a period of time, while the second shows "0 row(s) returned" and "-/0.000 sec". A call to the SP in MySQL Workbench where the new value is already in the table also shows two rows in the Output section, but the second one shows "1 row(s) returned".
Not sure whether any of the other code is needed here. If you think I need to show more, please ask.
UPDATE: Based on the comment from Pete Dishman, I took a harder look at where the error is occurring. It's not the original SP call giving an error. It's the next call to MySQL, which is still inside the Ajax call.
The code processing the result already had this code:
//not sure why next line should be needed.
mysqli_next_result($conn);
I tried both simply doubling the call to mysqli_next_result (that is, two in a row) and putting it into a loop along the lines Pete suggested. With two calls, I still get the same error. With a loop, I wait 30 seconds and then get error 500: Internal server error.
UPDATE 2: I tried with a loop for mysqli_more_results() (similar to the one in Pete Dishman's reply) and echoing a counter inside the loop. The code brought my internet connection to a crawl and I finally had to break out of it, but there were dozens of iterations of the loop. Just tried the following:
mysqli_free_result($result);
$result = mysqli_store_result($conn);
mysqli_free_result($result);
if (mysqli_more_results($conn)) {
    $result = mysqli_store_result($conn);
    mysqli_free_result($result);
}
$allresult = getsubmissions($conn);
Found a noticeable delay before it failed.
Even if you can't tell me what's wrong, I'd appreciate ideas for how to debug this.
This may be because the stored procedure is returning multiple result sets (the two rows in workbench).
When querying from php you need to retrieve both result sets before you can send another query, otherwise you get the "commands out of sync" error.
The documentation seems to imply this is only the case when using mysqli_multi_query but we have it in our code when using mysqli_real_query.
Our query code is along the lines of:
$results = array(); // collect rows from the first result set
mysqli_real_query($conn, $sql);
$resultSet = mysqli_store_result($conn);
while (!is_null($row = mysqli_fetch_array($resultSet, MYSQLI_BOTH)))
{
    $results[] = $row;
}
mysqli_free_result($resultSet);

// Check for any more results
while (mysqli_more_results($conn))
{
    if (mysqli_next_result($conn))
    {
        $result = mysqli_store_result($conn);
        if ($result !== FALSE)
        {
            mysqli_free_result($result);
        }
    }
}
return $results;
The code would be different obviously if you're using PDO, but the same principle may apply (See http://php.net/manual/en/pdostatement.nextrowset.php)
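For instance, a hedged sketch of the same drain-the-extra-result-sets idea with PDO (the procedure name and parameter here are illustrative):

$stmt = $pdo->prepare('CALL aoctest.SomeProcedure(:id)');
$stmt->execute(array('id' => 42));
$results = $stmt->fetchAll(PDO::FETCH_ASSOC);
// A CALL also returns a final status result set; consume everything
// so the connection is free for the next query.
while ($stmt->nextRowset()) {
    // discard any additional result sets
}
$stmt->closeCursor();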
I've solved my own problem by reworking the code that processes the result as follows:
if (mysqli_more_results($conn)) {
    mysqli_free_result($result);
    mysqli_next_result($conn);
    $result = mysqli_store_result($conn);
    mysqli_free_result($result);
    if (mysqli_more_results($conn)) {
        mysqli_next_result($conn);
        $result = mysqli_store_result($conn);
        if (!is_null($result) and gettype($result) !== 'boolean') {
            mysqli_free_result($result);
        }
    }
}
$allresult = getsubmissions($conn);
I've been searching for a suitable PHP caching method for MSSQL results.
Most of the examples I can find suggest storing the results in an array, which would then get included in the page. This seems great unless a request for the content is made at the same time as it is being updated/rebuilt.
I was hoping to find something similar to ASP's application-level variables, but as far as I'm aware PHP doesn't offer this functionality?
The problem I'm facing is that I need to perform six queries per page to populate dropdown boxes, and this happens on the vast majority of pages. It's also not an option to combine the queries. The cached data will need to be rebuilt sporadically, whenever the system changes; this could be once a day, once a week or once a month. Any advice will be greatly appreciated, thanks!
You can use Redis server and phpredis PHP extension to cache results fetched from database:
$redis = new Redis();
$redis->connect('/tmp/redis.sock');

$sql = "SELECT something FROM sometable WHERE condition";
$sql_hash = md5($sql);
$redis_key = "dbcache:{$sql_hash}";
$ttl = 3600; // values expire in 1 hour

if ($result = $redis->get($redis_key)) {
    $result = json_decode($result, true);
} else {
    $result = Db::fetchArray($sql);
    $redis->setex($redis_key, $ttl, json_encode($result));
}
(Error checks skipped for clarity)
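For the six-dropdown case in the question, a small wrapper along these lines can keep the call sites tidy (a sketch only; Db::fetchArray is the same hypothetical DB helper used above):

function cached_query(Redis $redis, $sql, $ttl = 3600)
{
    // Return cached rows for $sql, refreshing at most every $ttl seconds.
    $key = 'dbcache:' . md5($sql);
    $cached = $redis->get($key);
    if ($cached !== false) {
        return json_decode($cached, true);
    }
    $rows = Db::fetchArray($sql);
    $redis->setex($key, $ttl, json_encode($rows));
    return $rows;
}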
I have two PHP pages: query.php and result.php.
In query.php, I am executing a (select) query statement. It returns a result set:
$rs = mysql_query($query);
Now I want to return this resultset from query.php to another page result.php and work with it. Like this:
In query.php:
return $rs
and in result.php:
$result = executeQuery($query); // we get the resultset in this variable
while ($row = mysql_fetch_array($result)) {
    //do something
}
If the above is not recommended, please provide me with alternatives. But I want the query function and resultset in different pages.
You could just include results.php in your query.php page if you're just looking to keep the code separate in the source files but aren't actually required to redirect from one page to another:
In query.php:
$rs = mysql_query($query);
include "results.php";
In results.php:
while ($row = mysql_fetch_array($rs)) {
    //do something
}
As far as trying to "return $rs" from one page to another, that's not how PHP works; the return statement is only valid within a function. If you want to actually pass data from one PHP page to another and will be redirecting to that other page, then you'll need to use either a session, a cookie, pass it in the URL (i.e. use GET), or use curl and add it as a POST var. A sketch of the session approach follows.
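For example, a minimal sketch of the session approach (the rows must be fetched into a plain array first, since a MySQL result resource itself cannot be stored in a session):

// query.php
session_start();
$rs = mysql_query($query);
$rows = array();
while ($row = mysql_fetch_array($rs)) {
    $rows[] = $row;
}
$_SESSION['query_rows'] = $rows; // store plain data, not the resource
header('Location: result.php');
exit;

// result.php
session_start();
$rows = isset($_SESSION['query_rows']) ? $_SESSION['query_rows'] : array();
foreach ($rows as $row) {
    // do something
}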
If this is really the way it must be, store the result set in a database somewhere or in a file and give each result a unique name. Then pass that name to the next page so it can be retrieved.
query.php will redirect to result.php?result_set=ab24sdfsdfklls for instance.
This has the added advantage that you can use the result_set as often as you want. Visitors can have multiple result sets during one visit. They can share the URL of the result set page with other people, etc.
Just be sure to eventually prune the data store as it will just keep on growing, but that's another matter entirely.
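For illustration, a rough sketch of that file-based handoff (the directory and naming scheme here are hypothetical):

// query.php
$rows = array();
while ($row = mysql_fetch_array($rs)) {
    $rows[] = $row;
}
$token = md5(uniqid(mt_rand(), true)); // unique name for this result set
file_put_contents("/tmp/resultsets/$token.json", json_encode($rows));
header("Location: result.php?result_set=$token");
exit;

// result.php
$token = preg_replace('/[^a-f0-9]/', '', $_GET['result_set']); // sanitise
$rows = json_decode(file_get_contents("/tmp/resultsets/$token.json"), true);
foreach ($rows as $row) {
    // do something
}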
I am writing a simple user/login system in PHP with PostgreSQL.
I have a function that confirms whether a username/password exists, which gets activated when a user presses the Login button.
public function confirmUserPass($username, $password)
{
    $username = pg_escape_string($username);

    /* Verify that user is in database */
    $q = "SELECT password FROM users WHERE email = '$username'";
    $result = pg_query($this->link, $q);

    /* Do more operations */
}
I want to print the query result stored in $result so that I can see it in the browser. When I run the query in phpPgAdmin using SQL it shows me the output, but I cannot see it in the browser. I tried echo and printf but could not see anything. I also tried "view source" in the browser, but it shows nothing.
Can somebody help me with that?
Regards
From your code: $result = pg_query($this->link,$q);
As you've found already, trying to display the contents of $result from the line above will not give you anything useful. This is because it doesn't contain the data returned by the query; it simply contains a "resource handle".
In order to get the actual data, you have to call a second function after pg_query(). The function you need is pg_fetch_array().
pg_fetch_array() takes the resource handle you're given in $result and asks it for the next row of data.
A SQL query can return multiple rows, so it is typical to put pg_fetch_array() into a loop and keep calling it until it returns false instead of a data array. However, in a case like yours, where you are certain the query will return only one row, it is okay to simply call it once immediately after pg_query() without using a loop.
Your code could look like this:
$result = pg_query($this->link,$q);
$data = pg_fetch_array($result, NULL, PGSQL_ASSOC);
Once you have $data, then you've got the actual data from the DB.
In order to view the individual fields in $data, you need to look at its array elements. It should have an array element named for each field in the query. In your case, your query only contains one field, so it would be called $data['password']. If you have more fields in the query, you can access them in a similar way.
So your next line of code might be something like this:
echo "Password from DB was: ".$data['password'];
If you want to see the raw data, you can display it to the browser using the print_r() or var_dump() functions. These functions are really useful for testing and debugging. (hint: Wrap these calls in <pre> tags in order for them to show up nicely in the browser)
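For example (a quick debugging snippet, not production code):

echo '<pre>';
print_r($data);   // or var_dump($data) to see types as well
echo '</pre>';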
Hope that helps.
[EDIT: an after-thought]
By the way, slightly off-topic, but I would like to point out that your code indicates that your system may not be completely secure (even though you are correctly escaping the query arguments).
A truly secure system would never fetch the password from the database. Once a password has been stored, it should only be used in the WHERE clause when logging in, not fetched in the query.
A typical query would look like this:
SELECT count(*) n FROM users WHERE email = '$username' AND password = '$hashedpass'
In this case, the password would be stored in the DB as a hashed value rather than plain text, and the WHERE clause would compare that against a hashed version of the password that has been entered by the user.
The idea is that this allows us to avoid having passwords accessible as plain text anywhere in the system, which reduces the risk of hacking, even if someone does manage to get access to the database.
It's not foolproof of course, and it's certainly not the whole story when it comes to this kind of security, but it would definitely be better than the way you seem to have it now.
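A hedged sketch of that pattern (the hash function and column names here are illustrative; the point is that only hashed values are ever compared):

// At login time: hash what the user typed, then compare in the WHERE clause.
$hashedpass = hash('sha256', $password);
$q = "SELECT count(*) AS n FROM users WHERE email = $1 AND password = $2";
$result = pg_query_params($this->link, $q, array($username, $hashedpass));
$data = pg_fetch_array($result, null, PGSQL_ASSOC);
$loggedIn = ($data !== false && (int)$data['n'] === 1);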
You must connect to the database, execute the query, and then fetch the results.
Try this example adapted from php.net:
<?php
function confirmUserPass($username, $password)
{
    $username = pg_escape_string($username);

    // Connecting, selecting database
    $dbconn = pg_connect("host=localhost dbname=publishing user=www password=foo")
        or die('Could not connect: ' . pg_last_error());

    // Performing SQL query
    $query = "SELECT password FROM users WHERE email = '$username'";
    $result = pg_query($query) or die('Query failed: ' . pg_last_error());

    // Printing results in HTML
    echo "<table>\n";
    while ($line = pg_fetch_array($result, null, PGSQL_ASSOC)) {
        echo "\t<tr>\n";
        foreach ($line as $col_value) {
            echo "\t\t<td>$col_value</td>\n";
        }
        echo "\t</tr>\n";
    }
    echo "</table>\n";

    // Free resultset
    pg_free_result($result);

    // Closing connection
    pg_close($dbconn);
}
?>
http://php.net/manual/en/book.pgsql.php