I am thinking of a way to send a request to a page, receive the response, and then read it. The problem is that I don't know the most efficient way of achieving this.
$scanning = true;
while ($scanning) {
    // Re-request the page on every pass; fetching it only once would leave the loop stuck.
    $result = file_get_contents('scan_results.php');
    if (stristr($result, 'Nothing to scan.')) {
        $scanning = false;
    }
}
Right now, I simply send a request to the page and scrape the response for the sentence 'Nothing to scan.'.
The page scan_results.php has to perform a few tasks, so I keep requesting it until I get 'Nothing to scan.'. Basically, I am looking for a better way to detect when that sentence appears.
Try cURL: http://php.net/manual/en/book.curl.php
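A minimal sketch of that idea, assuming the scan page keeps returning partial output until it eventually prints 'Nothing to scan.' (the URL and the 5-second pause are illustrative, not taken from the original code):
$ch = curl_init('http://example.com/scan_results.php'); // hypothetical URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
curl_setopt($ch, CURLOPT_TIMEOUT, 30);
do {
    $result = curl_exec($ch); // re-fetch the page on every pass
    if ($result === false) {
        die('cURL error: ' . curl_error($ch));
    }
    sleep(5); // small pause so the scan page isn't hammered
} while (stripos($result, 'Nothing to scan.') === false);
curl_close($ch);
echo "Scanning finished.\n";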
I don't know how to explain what I need, nor which keywords to use to find a solution on Google, so I'll give a URL to make it clearer:
check an IP (click on: Check your current IP address)
Using this website as an example, I'd like to retrieve some information after all the lookups have finished.
I tried with file_get_contents and with the cURL functions, but I couldn't find a way to do it; I always get the original source code.
Any idea?
EDIT:
<body onLoad="setTimeout('get_my_blacklist()', 60000)">
...
...
<?php
echo '<iframe id="my_iframe" src="http://multirbl.valli.org/lookup/'.$ip.'.html"></iframe>';
?>
...
...
<script>
function get_my_blacklist()
{
// function to get the iframe content after a few seconds.
}
</script>
Here is the new code I tried, thanks to @Ludovic for his iframe idea.
Still working on it; I'll tell you whether it solves my issue.
Edit2: No matter how I try, I can't find a way to get the content of my iframe. And even if I succeeded, I don't know how I could update my database from jQuery/JavaScript.
First the page is built by a server-side script such as PHP; at that point you have all the requested IPs. Then the page is modified by a jQuery script that appears to query each blacklist for the IP.
That second step is asynchronous, so you can't know when the page has actually finished being built.
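Since those per-IP lookups run in the browser after the page has loaded, anything fetched from PHP will only ever contain the original source. If the end goal is just DNS blacklist results, a different approach is to query the blacklists directly from PHP and skip the site altogether; a rough sketch, where the IP and the two blacklist zones are only examples:
$ip = '203.0.113.45'; // example IP
$reversed = implode('.', array_reverse(explode('.', $ip)));
$blacklists = array('zen.spamhaus.org', 'bl.spamcop.net'); // example zones, not a complete list
foreach ($blacklists as $zone) {
    // A DNSBL answers with an A record when the IP is listed, so checkdnsrr() is enough here.
    $listed = checkdnsrr($reversed . '.' . $zone, 'A');
    echo $zone . ': ' . ($listed ? 'listed' : 'not listed') . "\n";
}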
I have a form that sends a POST request and shows paginated results. The problem is that when I want to see page 2 and beyond, those links send a GET request and the controller no longer sees the form data it needs to build the query. Does anyone know how to solve this?
I'm using symfony 1.4 and I don't know if there's a big difference from 2.x.
so let me discuss it...
When creating URLs you should use
<?php echo url_for('page/view?num='.$page_num) ?>
or something like that. Then you can use the request in your module's action class,
apps/{apps_name}/modules/page/actions/actions.class.php (or pageActions.class.php),
in your view method:
public function executeView(sfWebRequest $request)
{
    $page_num = $request->getParameter('num');
    echo $page_num;
}
You should now get the page number.
One more thing: this only works with $_GET requests.
You should learn how to use routing and configure at least three parameters; it will also help you handle $_POST requests.
I think the easiest way would be to make the search a GET request so that it is in the URL. Then the pagination links will carry the search value too.
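A rough illustration of that suggestion, with made-up file and field names: the form submits via GET, so the search value sits in the URL and every pagination link can simply repeat it.
<form method="get" action="search.php">
    <input type="text" name="q" value="<?php echo htmlspecialchars(isset($_GET['q']) ? $_GET['q'] : ''); ?>">
    <input type="submit" value="Search">
</form>
<?php
// Pagination links carry the current search value; only the page number changes.
$query = isset($_GET['q']) ? $_GET['q'] : '';
$totalPages = 5; // however many pages your paginator reports
for ($page = 1; $page <= $totalPages; $page++) {
    $url = 'search.php?' . http_build_query(array('q' => $query, 'num' => $page));
    echo '<a href="' . htmlspecialchars($url) . '">' . $page . '</a> ';
}
?>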
Is it good practice to use the PHP session to store several of my variables, which can be arrays of request results?
I need this because I would like to perform the request in one PHP file, store the result, and immediately (depending on the result) redirect to a page.
It's probably not the best way, which is why I'm asking.
Thanks for any advice.
edit: structure:
index.html
handler.php
view1.php
in index.html, I've got a
<form action="handler.php" ...
in handler.php, I construct a request and get a result,
if ($result->success)
    header("Location: ./view1.php");
else
    echo 'failed';
in view1.php, I would like to list the result array
Webshops do it - so why shouldn't you?
Some of the larger eCommerce frameworks store complicated data and objects in sessions and PHP handles this pretty well.
That's what sessions are for! So the general answer is "Yes: it's a good practice".
Here are some alternatives, however:
Consider using AJAX calls to update parts of the loaded page without reloading it;
Cookies - not good for large amounts of data, but they can generally live longer than a session. Not useful in your particular case, however;
SQL servers are usually well optimized: when your query returns lots of rows and you cut them into sections with a LIMIT clause, or simply repeat the same request shortly after the first one, the subsequent requests are not a big load for the database server.
I just saw your update to the question.
AJAX can do the trick for you best here. I can imagine it all done within a single web page:
the form data is submitted by an AJAX call to your handler.php, which...
returns either a JSON-packed array of results or a short string, NOT FOUND for example.
Then the JS on your page either creates a new DOM element - a table, or a set of divs - with the returned results, or just creates a new div with some sad toon face and a "we didn't find anything" message.
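A bare-bones sketch of what that handler.php could send back, where run_search() is a stand-in for whatever actually produces the result array:
// handler.php - answers the AJAX call with JSON instead of redirecting anywhere.
function run_search(array $post)
{
    // real lookup goes here; return an empty array when nothing matched
    return array();
}
header('Content-Type: application/json');
$results = run_search($_POST);
if (empty($results)) {
    echo json_encode(array('status' => 'NOT FOUND'));
} else {
    echo json_encode(array('status' => 'OK', 'rows' => $results));
}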
// set session
session_start();
$_SESSION['my_session'] = array('var1' => 'value1', 'var2' => 'value2'); // your result list
session_write_close();

// get session (on the page you redirect to)
session_start();
echo $_SESSION['my_session']['var1'];
if ($result->success)
    header("Location: ./view1.php");
else
    echo 'failed';
It is not good practice to use redirects to route requests like this. You can do it without an additional request from the user.
Like this:
if ($result->success) {
include(dirname(__FILE__) .'/'. 'view1.php');
} else {
echo 'failed';
}
Thus, all variables from handler.php will be available in view1.php.
I want to post tweets to Facebook using PHP cURL. This is the snippet I used for posting a tweet to FB - FB CURL SNIPPET.
But I can't find the posted tweet on my Facebook,
I'm not sure, but I think something went wrong.
Can you tell me whether the snippet is correct or not?
Thanks
This calls for debugging.
First port of call: It could be that the cookies are not saved: Check whether the script actually generates a my_cookies.txt file. If it doesn't, create an empty one and do a chmod 777 on it.
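A quick self-check you could drop into the script before the cURL calls (the file name matches the snippet; the 0777 permission is only a debugging shortcut):
$cookieFile = 'my_cookies.txt';
if (!file_exists($cookieFile)) {
    touch($cookieFile); // create an empty cookie jar for CURLOPT_COOKIEJAR / CURLOPT_COOKIEFILE
    chmod($cookieFile, 0777); // loosen permissions while debugging only
}
if (!is_writable($cookieFile)) {
    die("$cookieFile is not writable - cURL cannot store the login cookies.\n");
}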
Second port of call: curl_error().
Replace every curl_exec() call in the snippet with this (where $ch is the cURL handle used in the snippet):
$success = curl_exec($ch);
if ($success === false) echo "cURL error: " . curl_error($ch);
This might give you some pointers as to what is going wrong.
However, seeing as the script tries to imitate a browser instead of using an API, it could be that the structure of the submission form has changed on Facebook's side, in which case you'll have to parse the output cURL gives you and see what goes wrong.
All in all, if there is any way to do this cleanly through an API - I don't know whether there is - it would be much preferable to this.
I'm new to PHP and I'm trying to do something that may be bad practice and may well be impossible. I'm basically just hacking something together to test my knowledge and see what PHP can do.
I have one webpage with a form that collects data. That is submitted to a PHP script that does a bunch of processing - but doesn't actually display anything important. What I want is that once the processing is done, the script tells the browser to open another page, where the results are displayed.
I know I can use header('Location: page.php'); but I can't work out how to provide POST data with this. How can I do that? Alternatively, is there another way to tell the browser to open another page?
EDIT: What I'm taking from the responses is that it's possible to do this using various hacks but I'd be better off to just have the processing and the display code in one file. I'm happy with that; this was an experiment more than anything.
You could store that data in the session, e.g. in the first file that handles the POST:
session_start();
$_SESSION['formdata'] = $_POST; //or whatever
then you can read it on the next page like
session_start();
print_r($_SESSION['formdata']);
or you could pass it through GET (but as per the comments this is a bad idea):
header('Location: page.php?' . http_build_query($_POST));
If you do that, make sure you do additional processing/validation on page.php, as a malicious user could change the variables. Also, you may not need the whole POST transmitted to the next page.
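For example, on page.php you could re-validate each forwarded value rather than trusting it; the field name 'age' below is purely illustrative:
// page.php - never trust values forwarded through the URL; validate them again here.
$age = filter_input(INPUT_GET, 'age', FILTER_VALIDATE_INT);
if ($age === false || $age === null) {
    die('Invalid input.');
}
echo 'Age: ' . $age;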
Edit
I should make clear that I think the second option is possibly worse, as you are limited by the amount of data you can send through GET and it is possibly less secure, since users can more obviously manipulate the data.
Is it really necessary to call another page after the processing is done? I'd probably do the following:
<form method="post" action="display.php">
...
</form>
display.php:
if ($_POST) {
    require_once('process.php');
    process($_POST);
    display_results();
}
with process.php containing the code necessary for processing the post request.
Alternatively, you could use something like the cURL library to pass the results of the processing to a page specified by yourself. Don't know if that's really what you're after though.
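For completeness, a rough sketch of that cURL variant; the URL and fields are placeholders, and note the request happens server-to-server, so the user's browser never actually navigates to the target page:
$ch = curl_init('http://example.com/display.php'); // placeholder target page
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array('result' => 'value'))); // placeholder data
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
if ($response === false) {
    die('cURL error: ' . curl_error($ch));
}
curl_close($ch);
echo $response; // relay whatever the target page produced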
You could use JavaScript as a dirty work-around:
<form id="redirect_form" method="post" action="http://someserver.com/somepage.php">
<input type="hidden" name="field_1" value="<?php echo htmlentities($value_1); ?>">
<input type="hidden" name="field_2" value="<?php echo htmlentities($value_2); ?>">
<input type="hidden" name="field_3" value="<?php echo htmlentities($value_3); ?>">
</form>
<script type="text/javascript">
document.getElementById('redirect_form').submit();
</script>
(the script should be below the form)
There's no way to redirect the user's browser to an arbitrary page and have it send a POST request. That would be a serious security risk, where any link could cause you to make any form submission to an arbitrary site without you having any kind of clue about what was going to happen.
In short, it's not possible
AFAIK this is usually done as a two-step process:
On form.php, POST the data to the script process.php.
The process.php script processes the data but never outputs anything itself; it always calls header("Location: ...") to redirect to a success.php or failure.php page (if applicable), as sketched below.
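A bare-bones version of that process.php, with the session used to hand the results over and a placeholder standing in for the real validation:
// process.php - handle the POST, store what the next page needs, then redirect.
session_start();
$ok = !empty($_POST); // placeholder for the real processing/validation
$_SESSION['results'] = $_POST; // or whatever the processing produced
header('Location: ' . ($ok ? 'success.php' : 'failure.php'));
exit; // never output anything after the redirect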
Do it all in one script and just output different HTML for the results.
<?php $doingForm = empty($_POST); // show the form until something has been submitted ?>
<?php if ($doingForm) { ?>
html for form here
<?php } else { ?>
html for results
<?php } ?>
This problem has vexed me for some time. My custom CMS does some quite complex processing, uploading and manipulation, and so sometimes outputs quite lengthy error and information messages, which aren't suitable for converting to GET data, and I have always wanted to avoid the reload problem on data INSERT, but have not yet found an adequate solution.
I believe the correct way to go about this is to create message arrays for each possible state - each message or error you could want to display - so that you only need to send error/message numbers, which are a lot easier to handle than long data strings. It's something I have always shied away from personally, as I find it a bit tedious and cumbersome; frankly, this is probably just laziness on my part.
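A small sketch of that pattern, with invented message numbers and texts: only the number travels in the URL, and the wording stays in one place.
// All possible states get a number; the texts live in a single array.
$messages = array(
    1 => 'Record inserted successfully.',
    2 => 'Upload failed: the file was too large.',
    3 => 'Nothing to update.',
);
// After processing, redirect with just the number:
// header('Location: result.php?msg=2'); exit;
// On result.php, turn the number back into text:
$msgId = isset($_GET['msg']) ? (int) $_GET['msg'] : 0;
echo isset($messages[$msgId]) ? $messages[$msgId] : 'Unknown status.';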
I quite like the SESSION variable storage solution, but it raises the question of how you ensure the SESSION data is properly destroyed.
As long as you ensure you are only sending information (messages/errors) and not data that should or could be stored (and is thus potentially sensitive), this should be an avoidable problem.
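On the destruction question, one simple convention is to treat the entry as one-shot and drop it as soon as the receiving page has read it; the 'formdata' key follows the earlier example:
// view page - read the stored data once, then remove it so it cannot linger in the session.
session_start();
$formdata = isset($_SESSION['formdata']) ? $_SESSION['formdata'] : null;
unset($_SESSION['formdata']); // one-shot: gone after this request
if ($formdata === null) {
    die('Nothing to display.');
}
print_r($formdata);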
I hope I got your question right.
You might try this:
1. Adjust your form to look like this:
<form method="POST" action="process_data.php">
2. Then create the file process_data.php, which, surprisingly, processes the data.
And in this file you use header:
For example:
$head = sprintf("Location: page.php?data1=%d&data2=%d", $data1, $data2);
header($head);
I hope this helps.