In our project we are using Codeception to drive ChromeDriver for our User Acceptance Tests. The issue we are facing is that the tests take a really long time to run; even the most basic test takes at least one second.
Example:
public function testHomepage(Tester $i): void {
    $i->amOnPage('/');
    $i->see('Homepage');
}
Obviously I'm quite unhappy with the situation and want to speed this up. As we don't specify anything in our code: when is a page considered loaded by ChromeDriver? Is it time-based, or based on DOMContentLoaded or load?
Additional Information:
The Webdriver binding is provided by https://github.com/php-webdriver/php-webdriver
There are 3 types of Page Loading Strategies in Selenium: Normal, Eager and None.
By default, Selenium uses the normal page loading strategy. In this case Selenium blocks program flow until the page's document readiness state reaches "complete".
You can set the page loading strategy to "eager" to avoid waiting for scripts, images etc. on the page to finish loading; it returns as soon as the readiness state is "interactive".
You can read more about the Selenium page loading strategy in the Selenium documentation.
As for how to use it with PHP, I saw an example using this code:
$capabilities = DesiredCapabilities::chrome();
$capabilities->setCapability('pageLoadStrategy', 'eager');
$driver = RemoteWebDriver::create($host, $capabilities);
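Since the browser is driven through Codeception rather than raw php-webdriver, the capability can also be set in the suite configuration. A sketch, assuming the standard WebDriver module setup (the url and browser values are placeholders):

modules:
    enabled:
        - WebDriver:
            url: 'http://localhost'
            browser: chrome
            capabilities:
                pageLoadStrategy: eager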
I am working on a TYPO3 project where I have to dynamically disable caching based on a condition. It is a very specific use case that will not occur often.
I planned to use a USER_INT function, where I would perform the check and disable the cache if necessary. The USER_INT function works flawlessly, it is being called on every page load.
The thing is, I can not disable the cache, or at least I do not know how.
The code I have right now:
page = PAGE
page {
    typeNum = 0
    adminPanelStyles = 0
    11 = USER_INT
    11.userFunc = [COMPANY_NAMESPACE]\PageHandler->checkCache
}
And in the function I perform the check:
public function checkCache($content, $conf) {
    global $TSFE;
    $id = $TSFE->id;
    if ($this->checkIfDisableCache($id)) {
        //$TSFE->set_no_cache(); // <---- first I tried this one
        $TSFE->no_cache = true; // <---- after a while I got desperate and tried to disable it directly
    }
}
I also tried to play with the config; it did not work.
The funny thing is, if I set it directly in TypoScript:
config.no_cache = 1
it works, but since the check is rather complex, I want to use PHP to determine if the cache should be disabled.
I know I am doing something wrong, I just don't know what. Any help would be appreciated :)
I don't think either of the previous answers really explain the situation. You have sort of a catch-22 here, in that your USER_INT is executed after the page cache entry has been generated. The way it works internally is everything that can be cached gets rendered first, and every USER_INT then outputs a marker in the HTML source which gets replaced afterwards. This way the cache can contain the version with markers and those can be rendered without having to render the whole page.
So what you need to do in this case if you want the page cache to be disabled only in some conditions, is to use a custom TypoScript condition that is capable of setting config.no_cache = 1 only under special circumstances. That way you prevent generating a cache entry if the condition is met, but preserve full caching and cached output for every other request.
https://docs.typo3.org/typo3cms/TyposcriptSyntaxReference/TypoScriptParserApi/CustomConditions/Index.html
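A minimal sketch of such a condition, using the old-style userFunc condition from the linked page (the function name and the helper it calls are placeholders; old-style condition functions must be prefixed with user_):

[userFunc = user_shouldDisableCache()]
config.no_cache = 1
[global]

And the matching PHP, e.g. in typo3conf/AdditionalConfiguration.php:

function user_shouldDisableCache() {
    // Placeholder: run the complex check here. It must not rely on
    // rendered page content, since conditions are evaluated early.
    return \MyVendor\MySite\CacheCheck::shouldDisable(); // hypothetical helper
}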
Note that it is still recommended that you instead create the parts of your page that must not be cached as USER_INT objects. A use case where you sometimes need to disable the entire page cache indicates a possible misunderstanding of how the caching framework and/or USER_INT works. Hopefully the above explains those parts a bit.
If you look at the pibase (AbstractPlugin) code, you will see that setting $conf['useCacheHash'] and $conf['no_cache'] should probably be done.
https://api.typo3.org/typo3cms/current/html/_abstract_plugin_8php_source.html#l00190
If you create this object as USER_INT, it will be rendered non-cached, outside the main page rendering.
https://docs.typo3.org/typo3cms/TyposcriptReference/ContentObjects/UserAndUserInt/Index.html
I'm working on a Laravel 5.1 project, using a lot of AJAX calls returning HTML blocks.
To optimize the speed of the website, I want to implement private and public response caching. This works fine using the following code:
return response()
->json($result)
->header('Cache-Control', 'public, max-age=300');
Yet used this way, it won't take into account objects that are updated within the 300 seconds.
Is there a way to clear the response cache of a request if, and only if, the returned objects have been updated?
Maybe you can try server-side caching with something like the code below. Sorry, this is crude:
use Illuminate\Support\Facades\Auth;
use Illuminate\Support\Facades\Cache;

function sometest(User $user)
{
    /** . . . conditions to check if some data has changed . . . **/
    // Note: in Laravel 5.1 the second argument to Cache::remember()
    // is interpreted as minutes, not seconds.
    $jsonResponse = Cache::remember(Auth::id() . "_sometest", 300, function () use ($user) {
        $result = $user->all(); // get result here
        return $result;
    });
    return response()->json($jsonResponse);
}
You can read more about this in the Laravel Cache documentation.
You can also try:
config caching: php artisan config:cache
route caching: php artisan route:cache
and utilizing Memcached if you are able to.
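To get the "clear only when the objects have changed" behaviour from the question, one option is to bust that cache key from a model event. A minimal sketch, assuming the key scheme from the snippet above (the Message model and its user_id column are hypothetical):

use Illuminate\Database\Eloquent\Model;
use Illuminate\Support\Facades\Cache;

class Message extends Model // hypothetical model whose changes should invalidate the response
{
    protected static function boot()
    {
        parent::boot();

        // Drop the cached response whenever a row is saved or deleted,
        // so the next request rebuilds it with fresh data.
        $bust = function ($model) {
            Cache::forget($model->user_id . '_sometest'); // key scheme is an assumption
        };
        static::saved($bust);
        static::deleted($bust);
    }
}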
As others said, the client browser needs a request to know that the data has been updated. Here are some solutions I would look into in your case:
Server-side cache (data still needs to be transferred over the network):
depending on your environment, I would set up an Nginx + FastCGI cache with a "serve stale while updating" policy. The cache is always served (fast) and refreshed in the background, so only a few requests (one, or more, depending on the time needed to refresh the cache) after a content update are served with outdated data. This cache is URL-based, so if your content is cookie/session-based it can become tricky.
as #ZachRobichaud said, you can use the Laravel cache and set a low cache retention time. Let's say 10 s, which means requests will return outdated data for at most 10 s after your content update. I'm not aware of a "stale while updating" mode in Laravel, but it can be done with queues.
Client-side cache (no data transfer needed):
as I said, the client needs to know the data has been updated in order to invalidate its cache.
Usually for assets, we do "cache busting" by adding GET parameters to the file URL, like asset?version=1234, with the version changing on each deployment. As the header cache is URL- (and header-) based, the URL change forces a network load of the file. Not tested with a text/HTML content-type response, but worth a try if you can update URLs with a parameter you can change in .env, for example. This can be done dynamically if you have CI/CD or things you can trigger on deploy. In that case you can cache those "infinitely", since the "refresh" happens by changing the URL parameter.
You can take a look at the stale-while-revalidate Cache-Control header value, which seems to work the same way: always serve the cache, and refresh it if expired (also look at the other parameters; they can give you ideas). Be careful about compatibility here (no IE or Safari). See the header sketch below.
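For illustration, here is the response from the question with such a header attached (the max-age and stale-while-revalidate values are just examples):

return response()
    ->json($result)
    ->header('Cache-Control', 'public, max-age=300, stale-while-revalidate=60');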
The Laravel Cache may be the fastest to implement and test, so you can see if the results suit you. It also depends on the payload size: if it's huge, browser caching is indeed better. If bandwidth is not the issue, it's mostly server response time; in that case, the Laravel Cache would do the trick.
This might be my mistake somewhere, but anyway: I am using the Fat-Free Framework, which has a built-in function to minify multiple CSS/JS files into a single file, and I thought this would be good for optimization, but it turns out the opposite. If I keep the JS files separate (and they are at the end of my HTML), the total size adds up to around 364 kB and they seem to load in parallel within 1.5 seconds. If I try to load the combined version, however, the single file is around 343 kB but takes around 10 seconds to load.
My minifying logic is a bit different though. First in the template I call a function to load the files:
<script type="text/javascript" src="{{ #BM->minify('js','js/',array(
'vendor/jQui/jquery-ui-1.10.4.custom.min.js',
'vendor/datatables/jquery.dataTables.min.js',
'vendor/bootstrap.min.js',
'vendor/smartmenus-0.9.5/jquery.smartmenus.min.js',
'vendor/smartmenus-0.9.5/addons/bootstrap/jquery.smartmenus.bootstrap.min.js',
'vendor/smartmenus-0.9.5/addons/keyboard/jquery.smartmenus.keyboard.min.js',
'plugins.js',
'main.js'
)) }}"></script>
The function sets the appropriate session variables and returns a path.
public function minify($type = '', $folderpath = 'css/', $files = array()) {
    $filepaths = implode(",", $files);
    $this->f3->set('SESSION.UI_' . $type, $this->themeRelFolder() . '/' . $folderpath);
    $this->f3->set('SESSION.ReplaceThemePath_' . $type, $this->themeRelFolder());
    $this->f3->set('SESSION.m_' . $type, $filepaths);
    return $this->f3->get('BASE') . '/minify/' . $type;
}
The path maps to a controller which calls the minify method and spits out the actual minified content.
public function index($f3, $params) {
    $f3->set('UI', $f3->get('SESSION.UI_' . $params['type']));
    if ($params['type'] == 'css') {
        echo str_replace(
            "<<themePath>>",
            "../" . $f3->get('SESSION.ReplaceThemePath_' . $params['type']) . "/",
            \Web::instance()->minify($f3->get('SESSION.m_' . $params['type']))
        );
    } else {
        echo \Web::instance()->minify($f3->get('SESSION.m_' . $params['type']));
    }
}
I did it this way so that I can minify as many files as the template needs, and also maintain file paths regardless of the folder nesting structure inside a theme.
What am I doing wrong?
PS: I am testing this on my local WAMP setup, not an actual server, so the load times are obviously different from an actual web server.
Seems like the engine is re-minifying every time. I'll bet you just need to set up caching - http://fatfreeframework.com/web#minify :
To get maximum performance, you can enable the F3 system caching and F3 will use it to save/retrieve file(s) to minify and to save the combined output as well. You can have a look at the Cache Engine User Guide for more details.
http://fatfreeframework.com/quick-reference#cache :
Cache backend. F3 can handle the Memcache module, APC, WinCache, XCache and a filesystem-based cache.
For example: if you'd like to use the memcache module, a configuration string is required, e.g. $f3->set('CACHE','memcache=localhost') (port 11211 by default) or $f3->set('CACHE','memcache=192.168.72.72:11212').
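In the simplest case, that means switching the cache on before rendering. A minimal sketch, assuming the default filesystem backend is acceptable (the memcache line just repeats the docs' example above):

// Enable F3's cache so Web->minify() can store and reuse the combined
// output instead of re-minifying the source files on every request.
$f3->set('CACHE', TRUE); // auto-selects a backend, falls back to the filesystem
// or, with memcache:
// $f3->set('CACHE', 'memcache=localhost');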
You're making it minify those files on the fly every single time a page is loaded. This, obviously, takes time.
Consider minifying once, then just linking to that one file.
I currently use Selenium 2 with a local Selenium web server and the PHP-Webdriver written by Facebook.
Now I want to write an automated test for the Facebook Like button. Because this button is loaded through an iframe, I first select this frame via $driver->frame(array('id' => 1)) (I found out that Facebook normally loads two frames, and the second frame is the Like button). After clicking the Like button, a new frame is loaded where the user can also send a comment to his wall. Unfortunately the focus is still on the Like button frame, so I have to switch to the second frame. How can I do this?
Because I do not use Selenium RC, there is no Selenium.SelectFrame("relative=top") method. I also cannot use the method driver.switchTo().defaultContent(), because I do not use the Java webdriver. It seems that I can only use methods specified in the JsonWireProtocol. How can I switch between frames or change the focus back to the top frame?
The Python webdriver client has these methods for switching between iframes, windows and pop-ups:
switch_to_active_element
switch_to_alert
switch_to_default_content
switch_to_frame
switch_to_window
switch_to_default_content is the one you need. Find its analog in the PHP client.
UPDATE:
Since you mentioned the JsonWireProtocol:
http://code.google.com/p/selenium/wiki/JsonWireProtocol#/session/:sessionId/frame
POST /session/:sessionId/frame
Change focus to another frame on the page.
If the frame ID is null, the server should switch to the page's default content.
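For completeness, in the current php-webdriver bindings (php-webdriver/php-webdriver) the equivalent calls look roughly like this; a sketch, where the frame id is a placeholder:

use Facebook\WebDriver\WebDriverBy;

// Switch back to the top-level document (frame id null on the wire):
$driver->switchTo()->defaultContent();

// ...then descend into the sibling frame:
$frame = $driver->findElement(WebDriverBy::id('some-frame-id')); // placeholder id
$driver->switchTo()->frame($frame);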
I had this same problem. The issue I discovered is that "frame" in the PHP webdriver only switches to frames below the current one. So if you want to switch to a frame that is above the current one, you are out of luck. I had to select the main window, which essentially reset me to the top frame. From there I was able to select the correct frame.
//putting a sleep in so the page can load
sleep(SLEEPTIME + 5);
//getting a list of windows on the page
$windows = $webdriver->getWindows(); // function below
//switching to the first (main) window, which resets the focus to the top frame
$webdriver->selectWindow($windows[0]);
$webdriver->focusFrame('cpAppFrame');

public function getWindows() {
    $request = $this->requestURL . "/window_handles";
    $response = $this->execute_rest_request_GET($request);
    return $this->extractValueFromJsonResponse($response);
}
This is the link to a project I was making: http://shout.agilityhoster.com/login.html
Log in with
username: rafa
password: nadal
Now if I log in with another user
username: ana
password: ivanovic
then the website seems to run extraordinarily slowly. Could multiple timed JavaScript function calls be the reason? It works perfectly using XAMPP on my PC.
Thanks
you have multiple instances where you should have just one;
you are using inline JavaScript code where you could just use jQuery;
you are using body onLoad where you should use jQuery's DOM ready;
you are using multiple AJAX POSTs where you should have only one, using JSON;
your first account is probably faster than the second only because of the browser cache; note that local setups are always faster than an online server, depending on its speed and bandwidth.
Hope this helps.
I want to help you ;)
you have this:
$("#one").css("visibility","visible");
$("#onein").css("visibility","visible");
$("#closeaa").css("visibility","visible");
$("#onein").css("visibility","visible");
$("#Layer22").css("visibility","visible");
should be:
$(".ClassTheeseAll").css("visibility","visible");
or at least:
$("#Layer22,#onein,#Layer22,#closeaa").css("visibility","visible");
you have
<body onLoad="javascript:window.setInterval('open()', 1000000);checkrow();javascript:window.setInterval('check_newmsg()', 1000000)">
should be
$(function() {
    // pass the function reference instead of an eval'd string
    setInterval(initAllMyStuff, 1000000);
});
function initAllMyStuff() {
    open();
    checkrow();
    check_newmsg();
}
function getmsgs()
{
$.post("getmsg.php",{'name':name_one},function(data){$("#one").html(data);} );
$.post("getmsg.php",{'name':name_two},function(data){$("#two").html(data);} );
$.post("getmsg.php",{'name':name_three},function(data){$("#three").html(data);} );
}
should be:
$.post("getmsg.php", { 'name_one' : name_one , 'name_two' : name_two , 'name_three' : name_three } , function(data) { /* loop json and store where needed */ });
you then have:
function open() {
    jQuery(window).bind("beforeunload", function() { $.post("logout.php"); });
    $.post("online.php", function(data) {
        $("#Layer6").html(data);
    });
}
unload should just be:
$(window).unload(function() {
    $.post("logout.php");
});
to be continued...
You have something like 3-5 AJAX requests per second on this page; it can be sluggish at times just because of that.
As Curtis pointed out, try using the Network panel in Firebug.
Have you tried watching the network traffic with Firebug? That would tell you how long each of your network requests takes, and when they happen.
As in Curtis's answer, you'll want to profile the page with Firebug and YSlow [ https://addons.mozilla.org/en-US/firefox/addon/5369/ ]. If you find that you're waiting for images and the like, you could potentially swap out the onLoad call with jQuery's ready method:
http://api.jquery.com/ready/
That could create a perception of performance, even if the other page items aren't loading as quickly.
Terrible JavaScript aside, notice that requests to the site, even for the initial login page, are spectacularly slow (for example, the login page alone took about 10 seconds to load from here).
Also at some point I received this error from the site:
Web Server: Too many connections!
Check your server-side code for performance problems. Make sure your MySQL queries are performant. Make sure the server itself is correctly configured. Of course, you have no control over this if you're using shared hosting, and if you're on free hosting you'll just have to live with a terribly slow site.