After recording the script, I added a 'Cookie Manager'.
While running, cookies are not shown in the request headers in JMeter, and the listener shows a 'Connection closed' error.
However, the browser does pass cookies in the request headers in my application.
So, how can I pass cookies in JMeter? Kindly give me a solution.
Please refer to the snapshot.
Thanks,
Vairamuthu.
Just add an HTTP Cookie Manager to the Test Plan before your request.
I don't think you should be sending the cookie at all. I would rather expect that you need to extract the value of this _xsrf cookie from the /api/login request and add the extracted value as an X-Xsrftoken header.
Add the next line to the user.properties file (located in JMeter's "bin" folder):
CookieManager.save.cookies=true
and just in case this line as well:
CookieManager.check.cookies=false
Restart JMeter to pick the properties up.
Add an HTTP Header Manager to the request which is failing.
Make sure it contains the following line:
Name: X-Xsrftoken
Value: ${COOKIE__xsrf}
More information:
Configuring JMeter
How to Load Test CSRF-Protected Web Sites
Related
I use ngrok to tunnel localhost to a web address
./ngrok http 80
I use only custom PHP code. The last time I tested it, it was working OK. Now I can't log in, because it seems my PHP resets the data stored in the session every 5 or so requests.
By "reset" I mean that my code calls session_id(), does not get the ID back, and resets that valuable session data, including the internal captcha code! In the end the captcha comparison fails!
Everything works fine on localhost, though!
I reset session.cookie_domain with ini_set(), setting it to the ngrok URL.
Any ideas?
At last I found it: for an address xxx.ngrok.io, just set the PHP session cookie for the domain .xxx.ngrok.io, and do not include the http:// scheme.
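A minimal sketch of that fix, assuming the tunnel address is xxx.ngrok.io (replace it with your own subdomain):

```php
<?php
// Hypothetical bootstrap: set the session cookie domain to the bare
// ngrok host with a leading dot and *no* http:// scheme, before
// session_start() is called.
ini_set('session.cookie_domain', '.xxx.ngrok.io');
session_start();  // the session cookie now covers *.xxx.ngrok.io
```

The key point is that session.cookie_domain takes a hostname, not a URL, so passing the full ngrok URL makes the cookie invalid and the session restarts on every request.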
I have an SSL certificate on my web site. Once images are loaded on the page from another site, it causes warnings like "the page contains both secure and nonsecure items", so you have to press OK or you see a "broken" SSL connection in the browser. One way to avoid those warnings is to use an http page instead of https, correct?
But, as far as I know, there is another way to exclude those warnings using PHP or just JavaScript. I believe the images are loaded into a temporary folder on my server and served as https images at the same time.
Could anybody tell me the best way to do that?
Browsing the forum didn't help me a lot.
Thank you.
So, how do I load
<?php echo '<img src="http://www.not_my_site.com/image.jpg" alt="">'; ?>
with no warnings on my page https://my_site.com/index.php ?
You cannot suppress the error, as it's a browser thing.
The only way would be to wrap those calls using an https call on your site. Something like:
<?php echo '<img src="https://my_site.com/external.php?resource=http://www.not_my_site.com/image.jpg" alt="">'; ?>
You will have to write the external.php script to make the request on the client's behalf, and then return the content over your existing SSL connection. You only NEED to do this for external HTTP-only resources.
The process would work as follows:
1. The end user's web browser makes an HTTPS request to your external.php script.
2. external.php checks for a saved copy of the resource. If you've got it cached, skip to step 6, returning the cached resource.
3. Your server forwards the call on to the HTTP resource specified as resource.
4. The remote server responds to the request.
5. external.php saves a copy of the resource for caching.
6. Your web server's external.php script then returns that response over the SSL connection.
The web browser only makes 1 request; your web server just has to make one additional one.
This is the only way you'll be able to get rid of the message.
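The steps above could be sketched like this (the resource parameter name, the cache directory, and the single allowed remote host are assumptions taken from the example, not a fixed API):

```php
<?php
// external.php — hypothetical HTTPS image proxy with a simple file cache.

// Map a resource URL to a filesystem-safe cache path.
function cache_path(string $url, string $dir = '/tmp/img_cache'): string {
    return $dir . '/' . sha1($url);
}

function serve(string $resource): void {
    // Only proxy the one known remote host, so this is not an open proxy.
    if (parse_url($resource, PHP_URL_HOST) !== 'www.not_my_site.com') {
        http_response_code(400);
        return;
    }
    $cached = cache_path($resource);
    if (!is_file($cached)) {
        // Steps 3-5: forward the call to the HTTP resource, save a copy.
        if (!is_dir(dirname($cached))) {
            mkdir(dirname($cached), 0700, true);
        }
        file_put_contents($cached, file_get_contents($resource));
    }
    // Step 6: return the (cached) content over the existing SSL connection.
    header('Content-Type: image/jpeg');
    readfile($cached);
}

if (isset($_GET['resource'])) {
    serve($_GET['resource']);
}
```

The host whitelist matters: without it, anyone could use your server to fetch arbitrary URLs through your SSL connection.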
It looks even simpler to retrieve the image: use cURL to download the indirect image file.
It happens because you're making non-secure (HTTP) calls from a secured page (HTTPS).
Try changing your code to:
<?php echo '<img src="https://www.not_my_site.com/image.jpg" alt="">'; ?>
I tried using cURL to post to a local file and it fails. Can it be done? My two management systems are on the same server, and it seems unnecessary to have the request traverse the entire internet just to reach a file on the same hard drive.
Using localhost didn't do the trick.
I also tried $_SERVER['DOCUMENT_ROOT'].'/dir/to/file.php' with the post data. It's for an API that is encrypted, so I'm not sure exactly how it works. It's for a billing system I have, and I just realized that it sends data back (it's an API).
It's simply post data and an XML response. I could write an html form tag and input fields and get the same result, but there isn't really anything else to know.
The main question is: Does curl have the ability to post to a local file or not?
it is post data. it's for an API that is encrypted so i'm not sure exactly how it works
Without further details nobody can tell you what you should do.
But if it's indeed a POST-receiving script on the local server, then you can send a POST request to it using the URL:
$url = "https://$_SERVER[SERVER_NAME]/path/to/api.php";
And then receive its output from the cURL call.
$data = curl($url)->post(1)->postdata(array("billing"=>1234345))
->returntransfer(1)->exec();
// (you would use the cumbersome curl_setopt() calls instead)
So you get a XML or JSON or whatever response.
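Spelled out with the real curl_* calls, that pseudo-code corresponds to something like the following (the /path/to/api.php path and the billing field are just the placeholders from above):

```php
<?php
// Same request with the actual cURL API instead of the fluent pseudo-code.
$url = 'https://' . ($_SERVER['SERVER_NAME'] ?? 'localhost') . '/path/to/api.php';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_POST, true);                          // post(1)
curl_setopt($ch, CURLOPT_POSTFIELDS, ['billing' => 1234345]);  // postdata(...)
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);                // returntransfer(1)
$data = curl_exec($ch);                                        // exec()
curl_close($ch);

// $data now holds the response body as a string, or false on failure.
```

With CURLOPT_RETURNTRANSFER set, curl_exec() returns the body instead of printing it, which is what lets you parse the XML/JSON afterwards.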
If they're on the same drive, then use file operations instead:
file_put_contents('/path/to/the/file', $contents);
Using cURL should only be done if you absolutely NEED the HTTP layer to get involved for some reason, or if you're dealing with a remote server. Using HTTP would also mean you need to have the 'target' script be able to handle a file upload plus whatever other data you need to send, and then that script would end up having to do file operations ANYWAY, so in effect you've gone on a round-the-world flight just so you can move from your living room to the kitchen.
file://localfilespec.ext worked for me. I had 2 files in the same folder on a Linux box, in a folder that is not served by my webserver, and I used the file:// wrapper to post to file://test.php and it worked great. It's not pretty, but it'll work for dev until I move it to its final resting place.
Does curl have the ability to post to a local file or not?
To curl a local file, you need to set up an HTTP server, as file:// won't work, so:
npm install http-server -g
Then run the HTTP server in the folder where the file is:
$ http-server
See: Using node.js as a simple web server.
Then test the cURL request from the command line against localhost, like:
curl http://127.0.0.1:8081/file.html
Then you can do the same in PHP.
I just want to ask: do I need to set something to enable cookies on my hosting?
I have this:
<?php
setcookie("TestCookie","Hello",time()+3600);
print_r($_COOKIE);
?>
It works perfectly on my server, which is XAMPP, but when I upload it to my hosting,
it does not work. What should I do? Or what should I add to the code?
It is also possible that the cookie is actually sent but the client doesn't send the value back to the webserver in subsequent requests.
Possible causes:
your server's clock is misconfigured and therefore time()+3600 is in the past from the client's perspective => the client will delete the cookie immediately.
the cookie failed the general tail match check, search for "tail match" in http://curl.haxx.se/rfc/cookie_spec.html
the client is configured not to accept those cookies
There are many addons for different browsers that let you see the HTTP headers the client actually received, e.g. Firebug for Firefox. Use them to check whether there is a Set-Cookie header in the response. If there is, the client rejected it for some reason. If there is no such header, you have to check why the server didn't send it.
Cookies are sent as http response headers. Those headers can only be sent before anything from the response body has been sent.
Make sure no output happens before setcookie():
no echo, printf, readfile
nothing outside the <?php ?> tags, not a single white-space or BOM in any of the scripts that have been included and executed for the same request before setcookie()
increase the error_reporting level and check for warnings/notices. Those messages are usually logged in a file (e.g. error.log). You can set display_errors to true to see them in the output.
PHP can be configured to use output buffers. In this case output is not sent directly to the client but held in a buffer until either it's full, the buffer is flushed or the script ends (implicit flush). So until the content of the buffer is actually sent to the client you can set/modify/delete http headers like cookies. see http://docs.php.net/outcontrol.configuration
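As a sketch of that buffering behaviour, the following deliberately echoes before setcookie(), which would normally break it, but works because the output is held in a buffer:

```php
<?php
ob_start();                          // start buffering before any output
echo "accidental early output";      // would normally make setcookie() fail
$ok = setcookie('TestCookie', 'Hello', time() + 3600);
ob_end_flush();                      // headers (incl. Set-Cookie) go out here
// $ok is true because nothing had actually been sent to the client
// when setcookie() ran — it was all still sitting in the buffer.
```

The same effect can be had without code changes by enabling output_buffering in php.ini.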
Use web developer toolbar to view cookie.
My days of practicing PHP remind me:
there may be a newline ahead of <?php [this occurred to me when I used some proxies to upload]
free hosting providers include("top_ads.php") at the top of your PHP file [this occurred to me when I used free hosting]
If I'm not mistaken, cookies are not accessible on the same page; they only arrive with the next request. Once cookies are set, you need to forward to another page, e.g. via a header() call, and THEN the $_COOKIE vars become accessible. It's not meant to work on the same page.
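A minimal sketch of that set-then-redirect pattern (hypothetical page that redirects to itself; $_COOKIE is filled from the request headers, so the cookie only shows up on the follow-up request):

```php
<?php
if (!isset($_COOKIE['TestCookie'])) {
    // First request: send the cookie, then ask the browser to reload.
    setcookie('TestCookie', 'Hello', time() + 3600);
    header('Location: ' . ($_SERVER['PHP_SELF'] ?? '/'));
} else {
    // Follow-up request: the browser sent the cookie back,
    // so it is now visible in $_COOKIE.
    print_r($_COOKIE);
}
```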
I'm using Eclipse and XDebug to develop a PHP application that relies on web services.
I have test pages that consume my services in 2 ways: AJAX (using jQuery) and cURL.
I add breakpoints to my service page and launch the debugger. When I call the service from AJAX, execution stops nicely at the breakpoint, and I get my variables, step-by-step control, etc.
But when I call the service using cURL (i.e. from within a PHP page), the breakpoints fail to function. Even if I turn on the "Break at first line" debugger option, I cannot get execution to stop when using cURL.
Is it debugger behavior? Do I need to add a header to my cURL calls? Alter the URL? Or is it an XDebug limitation?
Thanks for your time and effort,
Guy
I can't comment yet, so I post this as an answer.
Can you debug more than one AJAX request in one session?
Was your debug session still running in Eclipse when you tried to debug using cURL?
Description on how it works for me:
Start a debug session with a simple debug.php file that contains only <?php and nothing else. It stops on the first line; you "continue" it and it finishes execution.
Now request the script using cURL (or another browser), adding ?XDEBUG_SESSION_START=ECLIPSE_DBGP to its path (I even think this addition is optional).
Your script should show up in the debug view, stopped at the first line.
Hope this helps.
Here is a tip on how to trigger the Xdebug client from cURL without a browser:
1- From command line:
curl -H "Cookie: XDEBUG_SESSION=1" http://YOUR-SITE.com/your-script.php
2- From PHP
<?php
$ch = curl_init ();
curl_setopt ($ch, CURLOPT_URL, 'http://YOUR-SITE.com/your-script.php');
curl_setopt ($ch, CURLOPT_COOKIE, 'XDEBUG_SESSION=1');
curl_exec ($ch);
?>
So it doesn't matter whether you attach "XDEBUG_SESSION=1" to the cURL URL; what is necessary is to send the proper cookie together with the request.
I know this is a pretty old thread, but I thought I'd post my experience for others that may come across it, like I did, with the same problem. What I discovered is that, if you are debugging remotely (which I always do), there are a couple settings you have to change in php.ini to make this work. Here are the ones that worked for me:
xdebug.remote_connect_back = false
xdebug.remote_host = {client host name or IP}
The first setting is normally "true," and tells xdebug to look for the client at the same IP address where the HTTP request originated. In this case however, the request is coming from the server, so that won't work. Instead you must use the second setting to tell xdebug where to find the client. Hope this helps save somebody a little time!
To trigger the debugger, the simplest solution is the cookie approach; -b XDEBUG_SESSION=ECLIPSE_DBGP worked for me in Eclipse, see below:
curl -H 'Content-type: application/json' \
  -b XDEBUG_SESSION=ECLIPSE_DBGP \
  -X POST \
  -d '{"uid":200, "message":"asdsad", "message_type":1}' \
  http://daxuebao.local:8083/api/message/send
When you are debugging the Ajax request, that one is sent by the browser, in the same navigation context as the other (non-Ajax) requests -- which is why it works fine.
The request sent by curl is in another, different, context -- and I'm not sure you can hook the debugger into that... But, maybe...
First of all, here's some information that might prove helpful, quoting Xdebug's documentation:
Xdebug contains functionality to keep track of a debug session when started through a browser: cookies. This works like this:
When the URL variable XDEBUG_SESSION_START=name is appended to a URL, Xdebug emits a cookie with the name "XDEBUG_SESSION" and as value the value of the XDEBUG_SESSION_START URL parameter.
When there is a GET (or POST) variable XDEBUG_SESSION_START, or the XDEBUG_SESSION cookie is set, Xdebug will try to connect to a debugclient.
To stop a debug session (and to destroy the cookie) simply add the URL parameter XDEBUG_SESSION_STOP. Xdebug will then no longer try to make a connection to the debugclient.
Maybe it might work if you set that cookie "by hand", sending it along with the cURL request...
I suppose you'd first have to get its value, as set by Xdebug at the beginning of the debugging session -- re-using the cookie you have in your browser should be possible, though.
Note : I've never tried this -- if you try, and it works, could you please confirm it worked ?
I ran into this same exact issue. I solved it by turning the auto-start feature off in php.ini:
xdebug.remote_autostart = 0
and then adding the API key to the webservice URL that my webservice client calls:
?XDEBUG_SESSION_START=<your API key here>
and I'm not sure if this matters, but I entered the API key into my debugger (MacGDBp). Now the debugger fires up only when the server-side webservice script is called, not when the client is started.
Hope this helps.