Use a PHP script to "click" on a link

I am stuck on an issue that I cannot figure out. I need to log in and then click repeatedly (thousands of times) on a specific link to create some load. I know I should use HTTP/PHP for that, but I do not know how to handle the login.
My next idea was to use some packet sniffer and re-send the requests, but will that still work once I close the browser?
Thanks

If you want to do this from a PHP script, you can use cURL to simulate the HTTP requests. However, it is easier to use a benchmarking tool like ab or siege to put load on specific URLs.
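If you do go the cURL route, a minimal sketch could look like the following; the form field names and URLs are placeholders for your real login form, and the cookie jar carries the session between requests:

$jar = tempnam(sys_get_temp_dir(), 'cookies');

// 1) log in once, storing the session cookie in the jar
$ch = curl_init('http://www.example.com/login.php');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, 'username=user&password=secret');
curl_setopt($ch, CURLOPT_COOKIEJAR, $jar);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
curl_close($ch);

// 2) "click" the link a thousand times, reusing the session cookie
for ($i = 0; $i < 1000; $i++) {
    $ch = curl_init('http://www.example.com/the-link.php');
    curl_setopt($ch, CURLOPT_COOKIEFILE, $jar);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_exec($ch);
    curl_close($ch);
}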
With ab:
ab -n 1000 -c 10 http://www.example.com/login.php
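Since the link sits behind a login, one option is to log in once in a browser, copy the session cookie, and hand it to ab with its -C flag (the cookie value here is just an example):
ab -n 1000 -c 10 -C "PHPSESSID=abc123" http://www.example.com/the-link.php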
From the Siege website:
Siege is a regression test and benchmark utility. It can stress test a single URL with a user defined number of simulated users, or it can read many URLs into memory and stress them simultaneously. The program reports the total number of hits recorded, bytes transferred, response time, concurrency, and return status. Siege supports HTTP/1.0 and 1.1 protocols, GET and POST directives, cookies, transaction logging, and basic authentication. Its features are configurable on a per user basis.

Take a look at Selenium. You may record an action and then edit the resulting script into a loop.
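If you end up scripting it yourself instead of recording, a rough sketch with the php-webdriver bindings could look like this; the URLs and element names are placeholders, and a Selenium server is assumed to be listening on port 4444:

require 'vendor/autoload.php';

use Facebook\WebDriver\Remote\RemoteWebDriver;
use Facebook\WebDriver\Remote\DesiredCapabilities;
use Facebook\WebDriver\WebDriverBy;

$driver = RemoteWebDriver::create('http://localhost:4444/wd/hub', DesiredCapabilities::chrome());

// log in once
$driver->get('http://www.example.com/login.php');
$driver->findElement(WebDriverBy::name('username'))->sendKeys('user');
$driver->findElement(WebDriverBy::name('password'))->sendKeys('secret');
$driver->findElement(WebDriverBy::name('submit'))->click();

// then click the target link in a loop
for ($i = 0; $i < 1000; $i++) {
    $driver->findElement(WebDriverBy::linkText('My Link'))->click();
}
$driver->quit();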

Related

PHP: will doing cURL in a loop slow down the server?

If I have a loop with a lot of curl executions happening, will that slow down the server that is running the process? I've noticed that while this process runs, if I open a new tab to access some other page on the website, the page doesn't load until the curl process finishes. Is there a way for this process to run without interfering with the performance of the site?
For example this is what I'm doing:
foreach ($chs as $ch) {
    $content = curl_exec($ch);
    // ... do random stuff ...
}
I know I can do multi curl, but for the purposes of what I'm doing, I need to do it like this.
Edit:
Okay, maybe this might change things a bit, but I actually want this process to run via WordPress cron. If it is running as a WordPress "cron" job, would it hinder the page performance of the WordPress site? In essence, if the process is running and people try to access the site, will they experience lag?
The curl requests are not asynchronous, so with curl used like that, any code after the loop has to wait to execute until each of the curl requests has finished in turn.
curl_multi_init is PHP's fix for this issue. You mentioned you need to do it the way you are, but is there a way you can refactor to use that?
http://php.net/manual/en/function.curl-multi-init.php
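A bare-bones curl_multi sketch, in case a refactor is possible (the URL list is a placeholder); the requests run concurrently instead of one after another:

$urls = array('http://example.com/a', 'http://example.com/b');

$mh = curl_multi_init();
$handles = array();
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// drive all transfers until every one has finished
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);          // avoid busy-waiting
} while ($running > 0);

foreach ($handles as $ch) {
    $content = curl_multi_getcontent($ch);
    // ... do random stuff ...
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);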
As an alternate, this library is really good for this purpose too: https://github.com/petewarden/ParallelCurl
Not likely, unless you use a strictly single-threaded server for development. In Apache, for example, different requests are handled by workers (which, depending on your exact setup, can be either threads or separate processes), and all these workers run independently.
The effect you're seeing is caused by your browser, not by the server. RFC 2616 suggests that a client only open a limited number of parallel connections to a server:
Clients that use persistent connections SHOULD limit the number of simultaneous connections that they maintain to a given server. A single-user client SHOULD NOT maintain more than 2 connections with any server or proxy.
(By the way, the standard usage of capitalized keywords such as SHOULD and SHOULD NOT is explained in RFC 2119.)
That limit is what e.g. Firefox, and probably other browsers too, use as their default. By opening more tabs you quickly exhaust these parallel connections, and that is what causes the wait.
EDIT: but after reading earl3s' reply I realize that there's more to it: earl3s addresses the performance within each page request (and thus the server's "performance" as experienced by the individual user), which can indeed be sped up by parallelizing the cURL requests, but at the cost of creating more than one simultaneous connection to the system(s) you're querying. And that's where RFC 2616's recommendation comes back into play: unless the backend systems delivering the content are under your control, you should think twice before parallelizing your cURL requests, as each page hit on your system will then hit the backend system with n simultaneous requests.
EDIT2: to answer the OP's clarification: no (for the same reason explained in the first paragraph: the "cron" job will run in a different worker than those serving your users), and if you don't overdo it, i.e., don't go wild with parallel threads, you can even mildly parallelize the outgoing requests. But do the latter more to be a good neighbour than out of fear of melting down your own server.
I just tested it, and it looks like the multi-curl process running on WP's "cron" made no noticeable negative impact on the site's performance. I was able to load multiple other pages without any terrible lag while the multi-curl process was running, so it looks like it's okay. I also made sure there is locking, so that the process doesn't get scheduled multiple times, and besides, it will only run once a day during U.S. low-peak hours. Thanks.
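A lock of the kind mentioned above can be sketched with WordPress transients; the function and hook names below are illustrative, not the actual code:

function run_daily_curl_job() {
    if (get_transient('curl_job_lock')) {
        return; // a previous run is still in progress
    }
    set_transient('curl_job_lock', 1, HOUR_IN_SECONDS); // lock expires on its own
    // ... the multi-curl work goes here ...
    delete_transient('curl_job_lock');
}
add_action('daily_curl_event', 'run_daily_curl_job'); // hypothetical cron hook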

Need to send asynchronous URL requests using PHP and know the time required

I have a database in the cloud, and I need to know at what point, and at what number of requests, the server will crash, so I have thought of sending asynchronous requests using PHP and then finding the time needed to serve each of them. I am a bit confused about how to proceed, and I am not sure if cURL will be useful here. Just a layout of how to proceed would be helpful.
ab -n 1000 -c 10 http://yourserver.com/
-n number of requests
-c concurrency
There are other tools to benchmark servers as well.
ab is part of the Apache tools.
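If you also want to drive and time the requests from PHP, as the question suggests, cURL can report per-request timing; a minimal sketch (the URL is a placeholder):

$ch = curl_init('http://yourserver.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
$seconds = curl_getinfo($ch, CURLINFO_TOTAL_TIME); // total time for this request
curl_close($ch);
echo "Request took {$seconds}s\n";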
Use siege or the Apache benchmark tool to load test your server by calling a single URL or multiple URLs; you can increase the concurrency and the volume of requests to the server. siege will give you a detailed report of the requests and concurrency and of how your server is performing; you can even hit your single server from multiple other servers.
It means that the server is heavily loaded with requests, i.e., all the threads are busy serving requests.
Solution: either increase the maxThreads attribute count for the connector in server.xml, or increase the acceptCount attribute value.
acceptCount: the maximum queue length for incoming connection requests when all possible request-processing threads are in use. Any request received when the queue is full will be refused.
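For illustration, both attributes live on the Connector element in Tomcat's server.xml; the numbers here are placeholders to tune for your load:

<Connector port="8080" protocol="HTTP/1.1"
           maxThreads="400"
           acceptCount="200"
           connectionTimeout="20000" />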

push and pull technologies using Ajax or Socket

I have a website that needs to send notifications to online clients in real time, like Facebook does. After more googling, I found a lot of documentation about push and pull technology, and ways to implement them using Ajax or sockets. I need to know what is best to use in my case, and how it is coded using JavaScript or jQuery and PHP.
I cannot say what's best to use in your case without knowing it in detail.
In most cases it is enough to have the clients check with the server every one or two seconds, asking if something new has happened. I prefer this over sockets most of the time because it works on every web server without any configuration changes and in any browser supporting AJAX, even old ones.
If you have few clients (every client requires an open socket on the server) and you want true real-time behavior, you can use WebSockets. There are several PHP implementations, for example this one: http://code.google.com/p/phpwebsocket/
If you can ensure that there will be only a single browser open per logged-in user, then you can apply this long-polling technique easily.
Policy for the Ajax call:
Do not make a request every 2 seconds.
Instead, wait and make the next request only 2 seconds after getting the response to the previous one.
If a request gets no response within 12 seconds, do not wait; send a fresh request. This is the connection-lost case.
Policy for the server response:
If there is an update, respond immediately. To check whether there is an update, rely on the session (better: have the client send a hint, such as the ID of the latest message it received; this second update-checking mechanism removes the single-browser restriction mentioned above).
Otherwise sleep() for 1 second (do not use a busy loop; use sleep), then check again whether there is an update. If there is, respond; if not, sleep for another second. Repeat until a total of 10 seconds has elapsed, then respond with "no update".
If you apply this policy (commonly known as long polling; see the sketch below), you will find processor usage reduced from 95% to 4% under heavy load.
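A minimal server-side sketch of this policy; the fetch_updates_since() helper is hypothetical and stands for whatever session or database check you use:

<?php
// long_poll.php
session_start();
$userId = isset($_SESSION['user_id']) ? $_SESSION['user_id'] : null;
session_write_close(); // release the session lock before sleeping

// hint from the client: id of the latest message it has already seen
$lastSeen = isset($_GET['last_seen']) ? (int)$_GET['last_seen'] : 0;

header('Content-Type: application/json');
for ($i = 0; $i < 10; $i++) {               // respond within ~10 seconds
    $update = fetch_updates_since($userId, $lastSeen); // hypothetical helper
    if ($update) {
        echo json_encode(array('update' => $update));
        exit;
    }
    sleep(1);                                // sleep, do not busy-loop
}
echo json_encode(array('update' => null));   // no news; client re-polls after 2 s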
Hope this explains. Best of luck.
Just apply the long-polling technique using jQuery.
Sockets are not yet supported everywhere, and you would also need to open a listening socket on the server for this to work.

Stress testing a webpage

I have a web-based phone dialer that I need to stress test. It requires human action to terminate a call and dial the next one. I need to simulate a situation in which 100 users use the service concurrently. I am not allowed to modify the JavaScript that dials the next number. There is also a login page the users must pass before they can reach the dial pad.
Any idea how do I do this?
You can use Apache JMeter to stress test your web app. First, set up JMeter as a proxy to record the HTTP transactions; then, using those transactions as a template, set it up to send 100 concurrent requests.
Maybe xdotool could be a good starting point for simulating the human interaction. How to handle the 100 concurrent users, though, I don't know yet. Hope this helps.

Testing How Code Scales

I am currently working on some Ajax-heavy code, and I am wondering how my server will scale as more and more users (hopefully) start to use my web app. It is only on my internal test server for the moment, and I was wondering how I would go about simulating a few hundred or thousand users so that I can see how it handles a heavier load. It is written in PHP/MySQL, and I really didn't want to find hundreds of computers to set up and test manually :) Thanks in advance; any advice or direction is much appreciated.
Apache Benchmark. It ships with the Apache web server and works kinda like this: for 100 requests using 10 concurrent threads, use this command:
ab -n 100 -c 10 http://localhost/
Replace the localhost URL with a URL that your AJAX code will be calling. The output will give you a nice report on how the requests were processed. Some of the interesting numbers are:
Complete requests
Failed requests
Requests per second
What I have done, which will give a very rough idea, is to write an application that can create tens or hundreds of threads and have each of them randomly hit the server dozens of times, as unique users.
So, I had 100 unique test users, and I would create 100 threads and just keep doing some operations at random intervals.
This will tell you whether you have scaling problems.
Ultimately, though, you will have a problem, as you only have one network connection, so that will throttle the speed somewhat; it isn't perfect.
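A rough sketch of that approach in PHP, forking worker processes instead of threads (requires the pcntl extension, Unix only; the URL is a placeholder):

$workers = 10;
for ($w = 0; $w < $workers; $w++) {
    $pid = pcntl_fork();
    if ($pid === 0) {                        // child: act as one test user
        for ($i = 0; $i < 100; $i++) {
            $ch = curl_init("http://testserver.local/?user=$w");
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            curl_exec($ch);
            curl_close($ch);
            usleep(rand(100000, 500000));    // random think time
        }
        exit(0);
    }
}
while (pcntl_waitpid(-1, $status) > 0);      // parent: wait for all children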
You can also use junitperf to help with this, as you can then look at whether each test is taking too long to respond.
The best bet is to take these tests, put them on as many machines as you can, and run them at the same time; having 10 computers each pretending to be 10 or 100 people is more effective than having one computer pretend to be all of them.
You won't want the webserver and the tests on the same computer, otherwise that will seriously skew the results, since the webserver will be getting only some of the CPU cycles.
Given this problem, I would log a user session. Then write client code that reproduces that set of calls.
Another option would be to use VMware (note: I'm an employee). Just create a bunch of VMs and connect to your site from each of them. It still means a lot of manual work, but at least you don't need as many physical machines to do the testing.
I recommend LoadUI http://www.loadui.org/
