looping through url - php

I want to do a loop. Normally it is done with while, do, for, etc., but when the process is big I came up with a solution: refresh the page by echoing a JavaScript redirect so the page reloads for the next iteration.
for example:
The page is http://localhost/index.php --> this performs the first iteration with $i=1;
at the end of the script it is redirected to http://localhost/index.php?i=$i+1 for the next iteration:
if (!isset($_GET['i'])) {
    $i = 1;
} else {
    $i = (int) $_GET['i'];
}
if ($i < 500) {
    // proceed with the work for this value of $i
    // then redirect to the next iteration:
    header('Location: http://localhost/index.php?i=' . ($i + 1));
    exit;
} else {
    echo "done";
}
Now consider a situation where the input parameters come to this script from a FORM (i.e. $parameter1, $parameter2, $parameter3).
Then I have to pass them every time to the new URL (next iteration).
Normally I could pass them as GET variables in the new URL, but how can I pass them if I don't want the user to be able to see the parameter values in the URL?

You cannot with a bare redirect, but if you're talking about a specific user, you can do so by assigning those parameters as session variables and then passing the session id as an additional parameter (or trusting that the user has cookies enabled).
function do_redirect($i, array $parameters)
{
    $i = (int) $i;
    $parameters['i'] = $i; // save the counter to the session as well
    $_SESSION['parameters'] = $parameters;
    // redirect to the next iteration; SID appends the session id when cookies are disabled
    // (this requires session_start() to have been called earlier)
    header('Location: http://localhost/index.php?i=' . $i . '&' . SID);
    exit;
}
if (is_form_request())
{
    $parameters = get_form_parameters();
    do_redirect(1, $parameters);
}
elseif (is_redirect_loop_request())
{
    $parameters = $_SESSION['parameters'];
    $i = $parameters['i'];
    if ($i < 500)
    {
        do_redirect($i + 1, $parameters); // note: $i++ would pass the old value and never advance
    }
    else
    {
        echo "done.";
    }
}

Not to be rude, but both answers above are quite prone to security issues (though the session solution is the better one). As for the 'encryption' solution from #itamar: that's not exactly encryption... it's called a 'Caesar cipher' (http://en.wikipedia.org/wiki/Caesar_cipher), which is about as safe as a paper nuclear bunker...
It can be much easier and as safe as can be: do not save the iteration counter in the session, but in the database. For the next request, the only thing you have to do is get the iterator from the database and go on with whatever you want to do. Sessions can be stolen, meaning someone could make you iterate from, say, $i=10 a thousand times. That cannot happen when the iterator is stored in a secure database.
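A rough sketch of that idea, assuming a loop_state table keyed by user id (the table, column, and connection details are placeholders, not part of the original answer):

<?php
// Sketch only: keep the loop counter server-side in the database.
// Table/column names (loop_state, user_id, i) and the DSN are assumptions.
session_start();
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$userId = isset($_SESSION['user_id']) ? (int) $_SESSION['user_id'] : 0;

// Fetch the current iterator for this user (fetchColumn() returns false, i.e. 0, if none yet)
$stmt = $pdo->prepare('SELECT i FROM loop_state WHERE user_id = ?');
$stmt->execute(array($userId));
$i = (int) $stmt->fetchColumn();

if ($i < 500) {
    // ... do one chunk of the heavy work for iteration $i here ...

    // Persist the next value; nothing sensitive ever appears in the URL
    $stmt = $pdo->prepare('INSERT INTO loop_state (user_id, i) VALUES (?, ?)
                           ON DUPLICATE KEY UPDATE i = VALUES(i)'); // assumes a unique key on user_id (MySQL syntax)
    $stmt->execute(array($userId, $i + 1));

    header('Location: http://localhost/index.php'); // kick off the next iteration
    exit;
}

echo "done";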


paginated api request, how to know if there is another page?

I am creating a PHP class that uses a 3rd-party API. The API has a method with this request URL structure:
https://api.domain.com/path/sales?page=x
Where "x" is the page number.
Each page returns 50 sales, and I need to fetch an unknown number of pages for each user (depending on the user's sales) and store some data from each sale.
I have already created some methods that get the data from the URL, decode it and create a new array with the desired data, but only for the first page request.
Now I want to create a method that checks if there is another page and, if there is, gets it and makes the check again.
How can I check if there is another page? And how can I create a loop that gets the next page if there is one?
I already have this code, but it creates an infinite loop.
require('classes/class.example_api.php');
$my_class = new Example_API;

$page = 1;
$sales_url = $my_class->sales_url( $page );
$url = $my_class->get_data($sales_url);

while ( !empty($url) ) {
    $page++;
    $sales_url = $my_class->sales_url( $page );
    $url = $my_class->get_data($sales_url);
}
I don't use cURL, I use file_get_contents(). When I request a page out of range, I get this result:
string(2) "[]"
And this other after json_decode:
array(0) { }
From your code: in the while loop you change $url (which actually holds the data returned by the API call), and that is what gets checked for emptiness, if I'm correct.
$url = $my_class->get_data($sales_url);
If the above is just the raw response (so, for a page out of range, the string "[]"), then empty("[]") will never be true. My guess is that the return value of get_data() is this string, while it should be the actual decoded array even when the result is empty (i.e. I suspect you perform the json_decode() only once you have collected the data, outside the loop).
If this is the case, my suggestion would be to either check for "[]" in the loop (e.g. while ($url !== "[]")) or decode the response data within the loop ($url = json_decode($url)).
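For instance, a small sketch of that second option, decoding inside the loop (this assumes get_data() returns the raw JSON string, as suspected above):

$page = 1;
$all_sales = array();

do {
    $raw  = $my_class->get_data($my_class->sales_url($page));
    $data = json_decode($raw, true); // the string "[]" decodes to an empty array

    if (!empty($data)) {
        $all_sales = array_merge($all_sales, $data); // keep the sales from this page
        $page++;
    }
} while (!empty($data));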
In my experience with several APIs, the response returns the total number of rows found, plus a fixed number of results per page starting with page 1.
In your case, if the response includes the number of rows, just divide it by the per-page count and loop through the resulting page numbers.
$results = 1000;
$perPage = 50;
$pages = ceil($results / $perPage);

for ($i = 1; $i <= $pages; $i++) {
    // execute your API call for page $i and store the results
}
Hope this helps.
From the responses you've shown, you get an empty array (once the JSON is decoded) when there are no results. In that case, you can use empty() on the decoded data in a loop to determine whether there is anything left to fetch:
// Craft the initial request URL
$page = 1;
$url = 'https://api.domain.com/path/sales?page=' . $page;
// Now start looping; decode each response so we can tell when a page is empty
$data = json_decode(file_get_contents($url), true);
while (!empty($data)) {
    // There's data in $data here, do something with it
    // Then set the new URL and fetch the next page
    $url  = 'https://api.domain.com/path/sales?page=' . ++$page;
    $data = json_decode(file_get_contents($url), true);
}
That way it will keep looping over all the pages, until there is no more data.
Alternatively, check the HTTP response headers for the total number of items in the set.
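If the API does expose such a header, a sketch along these lines could work with file_get_contents() (the header name X-Total-Count is only an assumption; check the API documentation for the real one):

$perPage = 50;
$body = file_get_contents('https://api.domain.com/path/sales?page=1');
$total = null;

// $http_response_header is populated by PHP after an HTTP file_get_contents() call
foreach ($http_response_header as $header) {
    if (stripos($header, 'X-Total-Count:') === 0) {
        $total = (int) trim(substr($header, strlen('X-Total-Count:')));
    }
}

if ($total !== null) {
    $pages = (int) ceil($total / $perPage);
    for ($p = 2; $p <= $pages; $p++) {
        $body = file_get_contents('https://api.domain.com/path/sales?page=' . $p);
        // ... decode $body and store this page's sales ...
    }
}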

Can I retry file_get_contents() until it opens a stream?

I am using PHP to get the contents of an API. The problem is, sometimes that API just sends back a 502 Bad Gateway error and the PHP code can’t parse the JSON and set the variables correctly. Is there some way I can keep trying until it works?
This is not an easy question because PHP is a synchronous language by default.
You could do this:
$a = false;
$i = 0;
while ($a == false && $i < 10)
{
    $a = file_get_contents($path); // returns false on failure (e.g. a 502 response)
    $i++;
    usleep(10); // pause between attempts (usleep() takes microseconds)
}
$result = json_decode($a);
Adding usleep() between attempts keeps your server from hammering the API each time it is unavailable (note that usleep() takes microseconds, so 10 is a very short pause; a larger value may be more realistic). And the function gives up after 10 attempts, which prevents it from freezing completely in case of a long outage.
Since you didn't provide any code it's kind of hard to help you. But here is one way to do it.
$data = null;
while (!$data) {
    $json = file_get_contents($url);
    $data = json_decode($json); // returns NULL if the response is not valid JSON
}
// The while loop won't stop until the JSON was valid and $data contains an object
var_dump($data);
I suggest you throw some sort of counter variable in there to stop attempting after X tries.
Based on your comment, here is what I would do:
1. You have a PHP script that makes the API call and, if successful, records the price and when that price was acquired.
2. You put that script in a cronjob/scheduled task that runs every 10 minutes.
3. Your PHP view pulls the most recent price from the database and uses that for whatever display/calculations it needs. If pertinent, also show the date/time that price was captured.
The other answers suggest doing a loop. A combo approach probably works best here: in your script, put in a few retries just in case the interface is down for a short blip. If it's not up after, say, a minute, use the old value until your next try.
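A rough sketch of that combined approach for the cron script (the endpoint, the prices table, and the 'price' field are hypothetical placeholders, not part of the original answer):

<?php
// Sketch of the cron job: retry for roughly a minute, otherwise keep the last stored value.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // placeholder credentials

$price = null;
for ($attempt = 0; $attempt < 6 && $price === null; $attempt++) {
    $json = @file_get_contents('https://api.example.com/price'); // hypothetical endpoint
    $data = ($json !== false) ? json_decode($json, true) : null;
    if (isset($data['price'])) {
        $price = $data['price'];
    } else {
        sleep(10); // retry every 10 seconds, about a minute in total
    }
}

if ($price !== null) {
    // Record the fresh price with a timestamp; the view always reads the newest row
    $stmt = $pdo->prepare('INSERT INTO prices (price, fetched_at) VALUES (?, NOW())');
    $stmt->execute(array($price));
}
// If every attempt failed, do nothing: the view keeps showing the most recent stored price.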
A loop can solve this problem, but so can a recursive function like this one:
function file_get_contents_retry($url, $attemptsRemaining = 3) {
    $content = file_get_contents($url);
    $attemptsRemaining--;

    if (empty($content) && $attemptsRemaining > 0) {
        return file_get_contents_retry($url, $attemptsRemaining);
    }

    return $content;
}
// Usage:
$retryAttempts = 6; // Default is 3.
echo file_get_contents_retry("http://google.com", $retryAttempts);

Preserving $_POST variables when paginating

I have a simple list of orders which a user can filter by status (open, dispatched, closed). The filter dropdown triggers a POST to the server and sends the filter value through. Orders are listed 10 to a page, with pagination links for any results greater than 10. The problem is that when I click the pagination links to view the next page of results, the filter value from the POST is lost.
public function filter_orders() {
    $page = ($this->uri->segment(4)) ? $this->uri->segment(4) : 0;
    $filter = $this->input->post('order_status_filter');

    $config = array();
    $config["base_url"] = base_url() . "control/orders/filter_orders";
    $config["per_page"] = 10;
    $config['next_link'] = 'Next';
    $config["uri_segment"] = 4;
    $config['total_rows'] = $this->model_order->get_all_orders_count($this->input->post('order_status_filter'));
}
How can I make the pagination and the filter work together? I've thought about injecting a query string into the pagination links, but it doesn't seem like a great solution.
The answer is very simple: use $_GET. You can also use URI segments, e.g.:
index.php/cars/list/5/name-asc/price-desc
The main reason you'll want to use $_GET is so you can link other users so they see the same result set you see. I'm sure users of your web app will want this functionality if you can imagine them linking stuff to each other.
That said, it would be ok to ALSO store the filters in the session so that if the user navigates away from the result set and then goes back, everything isn't reset.
Your best bet is to start a session and store the POST data in the session. In places in your code where you check to see if the user has sent POST data, you can check for session data (if POST is empty).
In other words, check for POST data (as you already do). If you got POST data, store it in the session. If a page has no POST data, check to see if you have session data. If you do, proceed as if it was POSTed. If you have both, overwrite the session with POST. You'll want to use new data your user sent you to overwrite older data they previously sent.
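As a sketch, inside the controller that could look roughly like this (plain PHP sessions are used here for brevity; CodeIgniter's own session library would do the same job, and the session key name is just a placeholder):

session_start();

// A fresh POST from the filter dropdown: remember the filter for later page clicks
if ($this->input->post('order_status_filter')) {
    $_SESSION['order_status_filter'] = $this->input->post('order_status_filter');
}

// Pagination links arrive without POST data, so fall back to what the session remembers
$filter = isset($_SESSION['order_status_filter']) ? $_SESSION['order_status_filter'] : null;

$config['total_rows'] = $this->model_order->get_all_orders_count($filter);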
You either put everything in $_GET or, if the data is sensitive, put it in $_SESSION. Then it travels between pages.
In your case there seems to be no reason to put your filter data anywhere other than in $_GET.
A query string does seem the best solution. You could store it in the session or in cookies as well, but it makes sense to also store it in the query string.
Store it in cookies or the session if you want to remember the user's choice. Which seems like a friendly solution. It allows the user to keep their settings for a next visit, or for another page.
Store it in the query string, because going to 'page 2' doesn't tell you anything if you don't know about filters, page size or sorting. So if a user wants to bookmark page 2 or send it by e-mail, let them be able to send a complete link that contains this meta information.
Long story short: Store it in both.
Maybe it's not the right answer, but give it a try:
<?php
// example url
$url = "index.php?page=6&filter1=value1&filter2=value2";
// to get the current url
//$url = "http://".$_SERVER["HTTP_HOST"].$_SERVER["REQUEST_URI"];
// change the page to 3 without changing any other values
echo url_change_index( $url, "page", 3 );
// will output "index.php?page=3&filter1=value1&filter2=value2"
// remove page index from url
echo url_change_index( $url, "page" );
// will output "index.php?filter1=value1&filter2=value2"
// the function
function url_change_index( $url, $name = null, $value = null ) {
    $query  = parse_url( $url, PHP_URL_QUERY );
    $filter = str_replace( $query, "", $url );
    parse_str( $query, $parsed );
    $parsed = ( !isset( $parsed ) || !is_array( $parsed ) ) ? array() : $parsed;
    if ( empty( $value ) ) {
        unset( $parsed[$name] );
    }
    else {
        $parsed[$name] = $value;
    }
    return $filter . http_build_query( $parsed );
}
?>

Zend Php Foreach Loop array

I have an input field and an array of emails from a DB table.
I am comparing the similarity of the input field to the array.
But somehow I am stuck with the loop.
Does that loop compare each email with the input field?
It always brings me to google.com no matter what I input, same or not.
Here's the code from the controller:
if (isset($_POST['btn_free_download']))
{
    // Get current input email from input field
    $email = $this->getRequest()->getParam('email');

    // Get all emails from the user
    $referred_users = $this->_helper->user()->getReferredUsers()->toArray();

    // Check the similarity with a loop
    foreach ($referred_users as $referred_user)
    {
        similar_text($email, $referred_user['email'], $similar);
    }

    // If yes, pop up message or redirect to some page
    if ($similar < 97)
    {
        $this->_redirect('http://google.com');
    }
    // If not, redirect user to free download page
    else
    {
        $this->_redirect('http://yahoo.com');
    }
}
I think you need to check the manual. The foreach construct works the same whether you use it in Zend, any other framework, or plain PHP.
$referred_users = $this->_helper->user()->getReferredUsers()->toArray();
$referred_users will probably hold an array of emails from the user table, say:
$referred_users = array(array('email' => 'one@email.com'), array('email' => 'two@email.com'), array('email' => 'three@email.com'));
Then, when you use the foreach loop, it will iterate through each of those emails:
foreach ($referred_users as $referred_user)
{
    // first pass: $referred_user['email'] = 'one@email.com', second pass: 'two@email.com', and so on
    similar_text($email, $referred_user['email'], $similar);
}
Now let us discuss your logic here:
// If yes, Pop up message or redirect to some page
if ($similar < 97)
{
$this->_redirect('http://google.com');
}
// If not, redirect user to free download page
else
{
$this->_redirect('http://yahoo.com');
}
Unless the last element in the array $referred_users is (almost) exactly equal to your $email, i.e. $email = "three@email.com", the final value of $similar will be less than 97 and you will always be redirected to Google, because only the last iteration's result survives the loop.
Which I assume is not what you are trying to do; you are probably just not familiar with how foreach works, which is why you are not getting the expected result.
Assuming you are trying to check whether any email in the array matches (i.e. whether the email entered as a parameter is similar to any of the emails in the table) and then redirect somewhere or show some message, the solution below might be helpful.
$similarText = false;

foreach ($referred_users as $referred_user)
{
    // first pass: $referred_user['email'] = 'one@email.com', second pass: 'two@email.com', and so on
    similar_text($email, $referred_user['email'], $similar);
    if ($similar > 97) {
        $similarText = true;
        break;
    }
}
// If yes, Pop up message or redirect to some page
if ($similarText)
{
$this->_redirect('http://google.com');
}
// If not, redirect user to free download page
else
{
$this->_redirect('http://yahoo.com');
}
Hope you got the idea. But please do check the manual before posting a question in the future.

How do I use cookies to store users' recent site history (PHP)?

I decided to make a recent view box that allows users to see what links they clicked on before. Whenever they click on a posting, the posting's id gets stored in a cookie and is displayed in the recent view box.
In my ad.php, I have a definerecentview function that stores the posting's id (so I can call it later when trying to get the posting's information such as title, price from the database) in a cookie. How do I create a cookie array for this?
EXAMPLE: user clicks on ad.php?posting_id=200
//this is in the ad.php
function definerecentview()
{
$posting_id=$_GET['posting_id'];
//this adds 30 days to the current time
$Month = 2592000 + time();
$i=1;
if (isset($posting_id)){
//lost here
for($i=1,$i< ???,$i++){
setcookie("recentviewitem[$i]", $posting_id, $Month);
}
}
}
function displayrecentviews()
{
echo "<div class='recentviews'>";
echo "Recent Views";
if (isset($_COOKIE['recentviewitem']))
{
foreach ($_COOKIE['recentviewitem'] as $name => $value)
{
echo "$name : $value <br />\n"; //right now just shows the posting_id
}
}
echo "</div>";
}
How do I use a for loop or foreach loop to make it that whenever a user clicks on an ad, it makes an array in the cookie? So it would be like..
1. clicks on ad.php?posting_id=200 --- setcookie("recentviewitem[1]",200,$month);
2. clicks on ad.php?posting_id=201 --- setcookie("recentviewitem[2]",201,$month);
3. clicks on ad.php?posting_id=202 --- setcookie("recentviewitem[3]",202,$month);
Then in the displayrecentitem function, I just echo however many cookies were set?
I'm just totally lost in creating a for loop that sets the cookies. any help would be appreciated
Don't set multiple cookies - set one that contains an array (serialized). When you append to the array, read in the existing cookie first, add the data, then overwrite it.
// define the new value to add to the cookie
$ad_name = 'name of advert viewed';

// if the cookie exists, read it and unserialize it. If not, create a blank array
if (array_key_exists('recentviews', $_COOKIE)) {
    $cookie = $_COOKIE['recentviews'];
    $cookie = unserialize($cookie);
} else {
    $cookie = array();
}

// add the value to the array and serialize
$cookie[] = $ad_name;
$cookie = serialize($cookie);

// save the cookie
setcookie('recentviews', $cookie, time()+3600);
You should not be creating one cookie for each recent search, instead use only one cookie. Try following this ideas:
1. Each value in the cookie must be separated from the others with a unique separator; you can use ",", ";" or "|". E.g.: 200,201,202
2. When retrieving the data from the cookie, if it exists, use explode(',', ...) on the cookie value, so you end up with an array of IDs.
3. When adding data to the cookie, again explode(',', ...) the cookie value to get the array of IDs, check that the new ID is not already in the array using in_array(), and add it using array_push(). Then implode the array using implode(',', ...) and write the resulting string back to the cookie.
That's pretty much it.
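For completeness, a small sketch of that single-cookie approach for the recent-views box (the cookie name recentviewitems is just a placeholder, not from the original question):

function definerecentview()
{
    if (!isset($_GET['posting_id'])) {
        return;
    }
    $posting_id = (string) (int) $_GET['posting_id'];
    $month = time() + 2592000; // 30 days from now

    // read the existing comma-separated IDs, if the cookie is already set
    $ids = isset($_COOKIE['recentviewitems'])
        ? explode(',', $_COOKIE['recentviewitems'])
        : array();

    // only add the ID if it is not already in the list
    if (!in_array($posting_id, $ids)) {
        array_push($ids, $posting_id);
    }

    // write the whole list back as a single cookie
    setcookie('recentviewitems', implode(',', $ids), $month);
}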
