I'm trying to log in to a website using phpQuery's WebBrowser plugin. I'm able to log in successfully, but I'm not sure how to reuse the cookies from one call in the next.
$client = phpQuery::browserGet('https://website.com/login', 'success1');
function success1($browser) {
    $handle = $browser
        ->WebBrowser('success2');
    $handle
        ->find('input[name=name]')
        ->val('username');
    $handle
        ->find('input[name=pass]')
        ->val('password')
        ->parents('form')
        ->submit();
}
function success2($browser) {
    print $browser; // prints page showing I'm logged in
    // make authenticated requests here
}
How do I make other requests with session/login cookies?
I've taken a look at the source code to help with this problem. My first impression is that the code is very poorly written: debugging code commented out, typos all over the place, mile-long functions, etc. You might want to consider switching to a different solution in the long run, because if the author changes something in this code, an upgrade could break your own code.
That being said, the WebBrowser plugin gives you access to the browser object itself, which contains a function called getLastResponse(). This returns a Zend_Http_Response object, which you can theoretically use to get the cookies.
The problem is that you don't have any way of setting those cookies. You would have to patch the web browser plugin somewhere around line 102 to include your own HTTP request object (parameter 2 for phpQuery::ajax()) with your cookies set, around here:
$xhr = phpQuery::ajax(array(
    'type' => 'GET',
    'url' => $url,
    'dataType' => 'html',
));
Alternatively you could also patch phpQuery.php line 691 to include a global cookie jar you could define as a singleton or so. (Right where it says $client->setCookieJar();).
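For illustration only, such a patch might look something like the following sketch. It uses the Zend_Http_Client/Zend_Http_CookieJar API that phpQuery relies on internally, but it has not been tested against the plugin itself:
// Hypothetical patch near the $client->setCookieJar(); call in phpQuery.php:
// reuse one shared cookie jar for every request instead of creating a new one.
static $sharedCookieJar = null;
if ($sharedCookieJar === null) {
    $sharedCookieJar = new Zend_Http_CookieJar();
}
$client->setCookieJar($sharedCookieJar);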
Again, this code is very poorly written; you are probably MUCH better off using raw cURL calls, even if it lacks a bit of functionality.
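If you do go the raw cURL route, a minimal sketch of the cookie handling might look like this (the URLs and form field names are placeholders, not taken from your site):
// Log in, store cookies in a jar file, and reuse them for later requests.
$cookieJar = tempnam(sys_get_temp_dir(), 'cookies');

$ch = curl_init('https://website.com/login');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
    'name' => 'username',
    'pass' => 'password',
)));
curl_setopt($ch, CURLOPT_COOKIEJAR, $cookieJar);   // write cookies here
curl_setopt($ch, CURLOPT_COOKIEFILE, $cookieJar);  // read cookies from here
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$loginPage = curl_exec($ch);

// Subsequent authenticated request reusing the same cookie jar
curl_setopt($ch, CURLOPT_URL, 'https://website.com/members');
curl_setopt($ch, CURLOPT_HTTPGET, true);
$memberPage = curl_exec($ch);
curl_close($ch);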
I have a rather unusual problem and I don't know how to deal with it.
I have a web app in Laravel; when I open the index page, I receive a text message on my mobile phone.
The problem is that sometimes I receive 2 or 3 messages, and sometimes 1.
Is there a tool to debug this strange behavior, which is not always the same?
A few words about my code:
The user opens the page, and because it's the first visit the Session doesn't have the attribute message_sent, so SendTextMessage::SendMessage($phoneNumber, $id_message, $smsCode, $newDateFormat); is executed. After that the Session has message_sent and the message can't be sent again, for example if I refresh the page.
SendTextMessage::SendMessage() is a class in Laravel Helpers.
controller code:
public function index($url_attribute, $id_message, Request $request)
{
    if(!Session::has('message_sent'))
    {
        $user = User::where('id_message', $id_message)->first()->toArray();
        $phoneNumber = $user['mobile_phone'];
        $smsCode = $user['sms_code'];
        $newDateFormat = date("d.m.yy", strtotime($smsExpirationTime));
        $request->session()->flash('message', 'Text message sended.' );
        SendTextMessage::SendMessage($phoneNumber,$id_message, $smsCode, $newDateFormat);
        Session::put('message_sent', true);
    }
    return view('login');
}
SendTextMessage Class:
class SendTextMessage
{
    public static function SendMessage($phoneNumber, $id_message, $smsCode, $newDateFormat)
    {
        $sms = new Connect();
        $sms->Create("user","pass",Connect::AUTH_PLAIN);
        $sms->Send_SMS($phoneNumber,"Message");
        $sms->Logout();
    }
}
Many thanks for any tip or help.
UPDATE:
The problem occurs only in Chrome.
Edge and Internet Explorer are fine.
As this script runs server-side, the browser shouldn't be an issue. Based on the code provided, there is no clear answer to give here.
Please try the following in order to debug your problem:
Log messages at each stage of the script in order to see which part was called how often. That will help you locate the problem. You can use \Log::error("Message") to do that.
Once you know where the problem might be, try to log decision-making / mission-critical variables to the logfile as well, e.g. \Log::error($session), so that you can understand why the problem occurs. One reason could be a badly configured session cache, or your cookies might be messed up. At some point there is probably a piece of data that is not the way you expect it to be.
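As a rough sketch, that kind of logging could look like this inside your index() action (the messages themselves are just examples):
public function index($url_attribute, $id_message, Request $request)
{
    \Log::error('index() called for message ' . $id_message);   // how often is the action hit?

    if(!Session::has('message_sent'))
    {
        \Log::error('message_sent missing from session, sending SMS');
        // ... existing lookup and SendTextMessage::SendMessage(...) call ...
        Session::put('message_sent', true);
        \Log::error('message_sent stored in session');
    }

    return view('login');
}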
You could try changing the way you use the Laravel session.
You indicated that it works fine in some browsers, which means your server-side code is correct so far, but something is messing with Chrome… From there,
if you take a quick look at the Laravel session docs, you'll see that the session can be stored in cookies, and I bet that this is your actual setup (check the SESSION_DRIVER variable in your .env file, or your config/session.php file).
If so, to confirm that this cookie-based session setting is the culprit, you might want to change the session config to make it browser-independent: any option other than cookies will work; the database or file options might be the easiest to set up… And if it works, I would strongly encourage you to keep using this no-cookie setting to make your code browser-safe.
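For example, switching the driver in your .env file (and then clearing the cached config with php artisan config:clear) would look like this; file is just one possible choice:
# .env — store sessions in files instead of cookies
SESSION_DRIVER=file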
I'm looking for an elegant way to hand over data/params when using $f3->reroute();
I have multiple routes configured in a routes.ini:
GET #sso: /sso/first [sync] = Controller\Ccp\Sso->first, 0
GET #map: /map [sync] = Controller\MapController->second, 3600
Now I reroute() to the #map route from first():
class Sso {
    public function first($f3){
        $msg = 'My message!';
        if( !empty($msg) ){
            $f3->reroute('#map');
        }
    }
}
Is there any "elegant" way to pass data (e.g. $msg) right into MapController->second()?
I don't want to use $SESSION or the global $f3->set('msg', $msg); for this.
This isn't an issue specific to the Fat-Free Framework, but to the web in general. When you reroute, you tell the user's browser to load a new page by sending a 303 redirect status code.
Take a minute to read the doc regarding re-routing: http://fatfreeframework.com/routing-engine#rerouting
There seems to be some contradictory information in your question, which leads me to question the purpose of what you are trying to achieve.
If you are rerouting, you can either use the session, cookies, or use part of the url to pass messages or references to a message.
If you do not need to redirect, but just want to call the function without changing the passed parameters, you could abstract the content of the function and call that shared function from both routes. You could also use the $f3 globals, which are a great way of passing data between functions in cases where you don't want to pass the data via the function call. Is there a reason why you don't want to use this? The data is global only for the single request, so there is no security concern, and it gets wiped at the end of the request, so there is very little extra footprint or effect on the server.
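As a rough sketch of the abstraction idea (the render() method name below is made up), both routes end up calling the same code and $msg is passed as an ordinary argument:
class Sso {
    public function first($f3){
        $msg = 'My message!';
        // Call the shared logic directly instead of rerouting;
        // render() is a hypothetical method that both routes delegate to.
        $map = new MapController();
        $map->render($f3, $msg);
    }
}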
If you're alright with not using #map_name in re-routes you can do something like this:
$f3->reroute('path/?foo=bar');
Not the prettiest I'll admit. I wish $f3->reroute('#path_name?foo=bar') would work.
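On the receiving end, the controller behind the target route can then read the value back out of the framework hive:
// In MapController->second(), after the redirect:
$foo = $f3->get('GET.foo');   // 'bar'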
I'm using Kohana 3.2, and I want to be able to call another script (unrelated to Kohana, outside of its 'jurisdiction') that returns an application/json response.
When I tried using:
$response = json_decode(Request::factory('/scripts/index.php?id=json')->execute()->body());
It errors out saying there's no route to scripts/index.php. So I tried using Request_Client_External
Request_Client_External::factory()->execute(Request::factory('/scripts/index.php?page=s'))->body();
Gives me Request_Exception [ 0 ]: Error fetching remote /scripts/index.php?page=s [ status 0 ] Could not resolve host: scripts; Host not found. It appears it needs a fully fledged URL using http/https, but how can I avoid the overhead of it doing a real external request?
Doing a
Request::factory(url::site('/scripts/index.php?page=s', 'http'))->execute()
works, but is it considered "external"?
The short answer to your question is that the only way to use Request::factory()->execute() to achieve that is to pass it the full URL (with whatever "overhead" that entails, which shouldn't be too much: your server's probably quite good at talking to itself).
Otherwise, ideally you'd put the functionality of scripts into a library and call that from Kohana. However, it sounds like that's not an option for you. If you have to leave /scripts/index.php untouched and insist on an 'internal' request, you could use PHP's output buffering, as illustrated below. But there are a bunch of caveats, so I wouldn't recommend it: the best way is passing a full URL.
// Go one level deeper into output buffering
ob_start();
// Mimic your query string ?id=json (see first caveat below)
$_GET = $_REQUEST = array('id' => 'json');
// Get rid of $_POST and $_FILES
$_POST = $_FILES = array();
// Read the file's contents as $json
include('/scripts/index.php');
$json = ob_get_clean();
$response = json_decode($json);
Some caveats.
Firstly, the code changes the superglobals ($_GET, $_REQUEST, $_POST, $_FILES). You probably don't use these in your Kohana code (you use $this->request->get() like a good HMVCer, right?). But in case you do, you should 'remember' and then restore the values, putting $old_get = $_GET; etc. before the above code, and $_GET = $old_get; etc. after.
Sessions: if your /scripts/index.php uses session_start(), this will cause a warning if you've already started a session at that point in Kohana.
Note that all variables set in scripts/index.php will remain set in the context you're in. If you want to avoid possible conflicts with that context, you'd start a new context, i.e. wrap the above into its own function.
Finally, you'd also need to make sure that /scripts/index.php doesn't do anything like Kohana::$base_url = 'something_else', or touch any other static attributes, or do something catastrophic using this.
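To illustrate the point about isolating the context, a rough sketch of wrapping the include in its own function (the function name is made up) could look like this:
// Wrapping the include in a function keeps the script's local variables
// out of your controller's scope (the superglobals are still shared, though).
function run_scripts_index(array $get)
{
    $_GET = $_REQUEST = $get;
    $_POST = $_FILES = array();
    ob_start();
    include '/scripts/index.php';
    return ob_get_clean();
}

$response = json_decode(run_scripts_index(array('id' => 'json')));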
First: please forgive me - I'm a bit of a novice at some of this...
I have a working test site which runs the PHP Facebook SDK to perform some simple Graph API requests successfully, namely reading the feed of a group the user is a member of, processing it, and displaying it back on a webpage.
This all works fine. The problem I have encountered is when trying to perform the same request via a PHP cURL POST to another webpage (on the same domain). It seems that the SDK does not carry the expected session over to another page when a POST request is formed (see "AUTH ERROR2" in the code)... this works fine when the following file is included via require_once, but not when a cURL call is made.
I would much rather do a cURL call, as I'm finding that when a require_once is done from a page at a different directory level, I get PHP errors about the page not being found, which is expected.
I may just be tackling this problem all wrong... there may be a simpler way to make sure that when files are included, their correct directory level remains intact, or there may be a way to send the currently authorized Facebook SDK session via a cURL POST. I have tried all of these to no avail, and I would really appreciate any help or advice on this.
Thank you for your time.
//readGroupPosts.inc.php
function readGroupPosts($postVars)
{
    //$access_token = $postVars[0];
    // ^-- I'm presuming I need this? I have been experimenting with appending it to
    // the graphAPI request, with no success...
    $groupID = $postVars[1];
    $limit = $postVars[2];

    require_once("authFb.inc.php"); //link to the facebookSDK & other stuff

    if ($user) {
        try {
            $groupFeed = $facebook->api("/$groupID/feed?limit=$limit"); //limit=0 returns all
            $groupFeed = $groupFeed['data']; //removes first tier of array for simpler access
            $postArray = array();
            for($i=0; $i<count($groupFeed); $i++)
            {
                $postArray[$i] = array($groupFeed[$i]['from']['name'], $groupFeed[$i]['message'], $groupFeed[$i]['updated_time'], count($groupFeed[$i]['likes']['data']));
            }
            return $postArray;
        } catch (FacebookApiException $e) {
            error_log($e);
            $user = null;
            return "AUTH ERROR1"; //for testing..
        }
    }
    else
    {
        return "AUTH ERROR2"; //no user is authenticated i.e. $user == null..
    }
}
I would much rather do a cURL call, as I'm finding that when a require_once is done from a page at a different directory level, I get PHP errors about the page not being found, which is expected.
I may just be tackling this problem all wrong...
Definitely.
Using cURL as a “workaround” just because you’re not able to find your way around your server’s file system is an outrageous idea. Don’t do it. Stop even thinking about it. Now.
there may be a simpler way to make sure that when files are included, their correct directory level remains intact
Yes – for example, use absolute paths instead of relative ones. Prefix the path with the value of $_SERVER['DOCUMENT_ROOT'], for example – that way, once you've given the path correctly with respect to this "base path", it does not matter where you're requiring the file from, because an absolute path is the same no matter where you look at it from.
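As a quick illustration (the path below is made up; adjust it to wherever the file actually lives):
// Resolve the include from the document root, so the same line works
// regardless of which directory the requiring script lives in.
require_once $_SERVER['DOCUMENT_ROOT'] . '/includes/readGroupPosts.inc.php';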
(And since this is not a Facebook-related problem at all, but just concerns basics of PHP and server-side programming, I’ll edit the tags.)
How would I go about writing a simple PHP bot that could log in and receive all the cookies sent by the server... and then send them back when required?
Any suggestions would be appreciated.
First of all, your question is too broad and lacking in detail to really answer effectively. That said, I'll give it a try.
Not knowing what exactly you mean by "log in", I assume you want the script to be able to post some data to another script via an HTTP request. The cURL library is good for that: it is able to post data and handle cookies.
Edit: Got ninja'd by Zed. ;)
If for some reason you cannot use the curl extension on your server (extension not installed), you can use a class such as Snoopy, which will still allow you to either use the curl binaries or use sockets to retrieve the information.
Snoopy handles cookies.
As for the writing the bot itself, it's just a question of sending the proper requests. Here is an example with Snoopy:
$snoopy = new Snoopy;
// The following needs to reflect the form configuration of the site
$login = array('usr' => 'hi', 'pwd' => 'hello');
if($snoopy->submit('http://example.com/login', $login) === false) {
    // output the response code
    die($snoopy->response_code . ':' . $snoopy->error);
}
//Request succeeded (doesn't mean we are logged in)
// output the results
echo $snoopy->results;
// Check the results to see if you are logged in and
// Continue using $snoopy.
// It will pass the proper cookies for the next requests.
With the help of the cURL library?