Composer autoload performance [duplicate] - php

Do unused use statements in my classes affect the performance of my PHP website?
Does PHP load all classes at the beginning, or only when they are needed? If it's the latter, then I assume they don't affect the performance of my system.
For example, the class imported by this use statement is never used:
use model\adapter\DbConnector;

No, the use statement does not cause the class to be loaded (it does not even trigger an autoloader).
It just declares a short alias for a class name. I assume the cost in terms of CPU and RAM is on the order of a few CPU cycles and a few bytes.
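A minimal sketch (the class path is the hypothetical one from the question) showing that a use statement alone never fires the autoloader:

```php
<?php
// Register an autoloader that logs every attempt, so we can see
// that the use statement below never triggers it.
spl_autoload_register(function (string $class): void {
    echo "autoloading $class\n"; // a real autoloader would require the file here
});

use model\adapter\DbConnector; // compile-time alias only: prints nothing

echo "script body runs, class still not loaded\n";

// Only an actual reference would fire the autoloader, e.g.:
// $db = new DbConnector();
```

Running this prints only the "script body runs" line; the autoloader stays silent until the class is actually referenced.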

Newer versions of PHP, PHP 7 and especially PHP 7.2, are very good at optimizing code when it is compiled into bytecode. Unused use statements are simply stripped away by the compiler and never execute, so they should have no impact whatsoever. The compiler might spend a few extra CPU cycles while parsing the file, but if you use OPcache there will be no effect on performance. Files are only loaded when they are needed.

Related

PHP/Zend: Overhead of large number of unused variables and functions

I have a PHP file that is included by a lot of other PHP scripts, which all use only a subset of the functions and variables defined in that included file. (I guess this is the usual case for most larger libraries.)
For this reason, in most cases only a small part of the included file is actually used and most of it simply ignored (unused functions, never referenced variables, etc.).
But AFAIK all recent versions of PHP come with the Zend optimizer, which, as far as I understand it, produces some kind of bytecode that is then used at runtime. It should therefore filter out all unused code, so even a huge number of unused functions would cause zero overhead at runtime.
Is this the case or is there a performance overhead for using large libraries in PHP?
From the PHP 5.5 change log of new features:
The Zend Optimiser+ opcode cache has been added to PHP as the new
OPcache extension. OPcache improves PHP performance by storing
precompiled script bytecode in shared memory, thereby removing the
need for PHP to load and parse scripts on each request.
What I understand from that statement is that every .php file, once converted into bytecode, is saved into shared memory so that the conversion does not need to be repeated per file. Since that step is no longer performed on every request, processing time goes down.
This means that the uncalled functions and un-needed variables get declared and stored in the cache but never used.
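A small sketch of how one might check whether a script's bytecode is already in shared memory, assuming the OPcache extension is available (the helper name here is made up):

```php
<?php
// Hypothetical helper: does OPcache already hold the compiled
// bytecode for a given script? Returns false when the extension
// is missing/disabled or the script has not been cached yet.
function isCachedByOpcache(string $file): bool
{
    if (!function_exists('opcache_get_status')) {
        return false;
    }
    $status = opcache_get_status(true); // true: include per-script stats
    return is_array($status) && isset($status['scripts'][$file]);
}

echo isCachedByOpcache(__FILE__) ? "cached\n" : "not cached\n";
```

On the CLI this will typically report "not cached" unless opcache.enable_cli is turned on; under a web server with OPcache enabled, repeat requests hit the cached bytecode.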
is there a performance overhead for using large libraries in PHP?
The answer to that is almost always "yes". Numerous benchmarks have shown that large libraries add measurable overhead, even with an opcode cache (such as APC or Zend Optimiser) in place.

Does including PHP files that contain functions in it slow the pages with those included even if not being used?

That's basically my whole question: if I have PHP pages with 5,000-10,000 lines of code for a certain purpose (in my case image upload management, cropping and such), would including them on every page that doesn't use them slow down the rest of my documents? Basic logic tells me it of course would, but I'm not an expert, so I don't know whether PHP behaves differently than I expect.
include and require statements make PHP compile/interpret the files that you include. That does cost some computation, but in 99% of cases it won't matter, unless your site is very popular and saving that computation time is important. In that case, you can solve this very easily with so-called PHP accelerators (like XCache or APC). These can be installed alongside your PHP installation and cache all the compiled opcode from your PHP scripts in RAM. Improvements with this solution vary between 40 and 75%.
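The point that include and require are ordinary runtime statements can be sketched as follows (the file name is hypothetical):

```php
<?php
// The cost of parsing and compiling image_upload_tools.php is only
// paid when the require line actually executes; a skipped branch
// costs nothing.
$needsImageTools = false; // e.g. decided per request/route

if ($needsImageTools) {
    require __DIR__ . '/image_upload_tools.php'; // skipped this request
}

echo "page served without touching image_upload_tools.php\n";
```

With an opcode cache the compile cost disappears after the first hit anyway, but the file's definitions are still only brought in when the require runs.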
There will be a slight slowdown, as the unused functions (extra code) need to be parsed, and they also take extra memory. Apart from that, there is no other effect.

Compilation of PHP- to op-code and having the opcode executed

PHP is usually compiled to opcode by the Zend engine at execution time.
To skip the compiling every time one can use an opcode cache like APC to save the opcode in shared memory and reuse it.
Okay, so it seems there is no solution yet for simply compiling the PHP to opcode ahead of time and using that, similar to how Java works.
But why? I am wondering about that because this is a quite obvious idea, so I guess there is a reason for it.
EDIT:
the core question is this:
wouldn't make PHP-compilation make opcode-caching superfluous?
The only "reason" against it would be that you couldn't just fix something on the live system ... which is bad practice anyway.
You've given one reason against it.
Another very important one is that if you separate the compile step from the runtime, both in terms of when each occurs and in terms of the hardware where each runs, you quickly run into complex dependency problems: what happens when you try to run opcode generated by PHP 5.1 on a PHP 5.3 runtime?
It also makes debugging of code harder - since the debugger has to map the opcode back to the source code.
But a very important question you don't seem to have asked let alone answered is what is the benefit of pre-generating the opcode?
Would compiling the opcode prior to runtime have a significant benefit over caching the opcode? The difference would be immeasurably small.
Certainly the raison d'être for HipHop is that natively compiled PHP code runs faster than PHP with opcode caching, at the expense of some functionality. But that's something quite different.
Do you think that having only the opcodes on the server improves the security (by obscurity)?

PHP Includes and Memory

I hope this is not a completely stupid question. I have searched quite a bit for an answer, but I can't find (or recognise) one exactly on point.
I understand that functions in PHP are not parsed until actually run. Therefore, if I have a large class with many functions, only one of which requires a large include file, will I potentially save memory if I only include the "include file" within the function (as opposed to at the top of the class file)?
I presume that, even if this would save memory, it would only do so until such time as the function was called, after which the memory would not be released until the current script stopped running?
Many Thanks,
Rob
I love this saying: "Make it work and then, if needed, make it fast." -some good programmer?
In most cases you would probably be better off focusing on good OOP structure and application design than on speed. If your server is using something like Zend Optimizer, having all your methods in a single file won't make any difference, since it is all pre-compiled and stored in memory. (It's more complicated than this, but you get the idea.)
You can also load all your include files when apache starts. Then all the functions are loaded in memory. You wouldn't want to do that while developing unless you like to restart Apache every time you make a code change. But when done on production servers it can make a huge difference. And if you really want to make things fast you can write the code in C++ and load it as a module for Apache.
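For what it's worth, later PHP versions (7.4+) added built-in support for exactly this "load everything at server start" idea via OPcache preloading; a hypothetical configuration sketch (all paths made up for illustration):

```ini
; Sketch (PHP 7.4+): preload a bootstrap script into shared memory
; once, when the server starts, instead of on each request.
opcache.preload=/var/www/app/preload.php
opcache.preload_user=www-data
```

The preload script itself typically compiles or requires the library files, after which every worker sees them already loaded.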
But in the end... do you really need that speed?
Yes it will, but be sure that the function doesn't depend on any other functions included in the parent. The memory consumption also depends on a couple of things, from the size of the file itself to the amount of memory its variables require and how well garbage collection works.
If the function is inside a class, it's called a method, and it might depend on its class to extend another class.
Just some things to consider. Always include the bare minimum.
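The deferred-include pattern the question asks about might look like this (the class and file names are hypothetical):

```php
<?php
// The heavy include is deferred into the one method that needs it,
// so its memory cost is only paid if that method is actually called.
class ImageManager
{
    public function crop(string $path): void
    {
        // Loaded on first call only; require_once guards repeats.
        require_once __DIR__ . '/lib/heavy_image_lib.php'; // hypothetical
        // ... call into the image library here ...
    }

    public function listImages(): array
    {
        return []; // cheap path: the heavy library is never loaded
    }
}

$mgr = new ImageManager();
$mgr->listImages(); // the require_once above never ran
```

As the question suspects, once crop() runs, the included file stays in memory for the rest of the request.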
Don't save memory in such cases unless you really need to; save development time instead. Memory is usually cheap, but development/support time isn't. Use a PHP opcode cache like eAccelerator or APC; it will increase execution speed because all files will be pre-compiled and stored in memory.

Question of PHP cache vs compile

From my understanding, if you use a PHP caching program like APC, eAccelerator, etc., then opcodes will be stored in memory for faster execution upon subsequent requests. My question is, why wouldn't it ALWAYS be better/faster to compile your scripts, assuming you're using a compiler like phc or even HPHP (although I know they have issues with dynamic constructs)? Why bother storing opcodes, since they have to be re-read by the Zend Engine, which uses C functions to execute them, when you can just compile and skip that step?
You cannot simply compile PHP to C and have your script execute the same way. HPHP does real compilation, but it doesn't support the full set of PHP features.
Other compilers actually just embed a php interpreter in the binary so you aren't really compiling the code anyway.
PHP is not meant to be compiled. Opcode caching is very fast and good enough for 99% of applications out there. If you have Facebook-level traffic and you have already optimized your back-end database, compilation might be the only way to increase performance.
PHP is not a thin layer to the std c library.
If PHP didn't have eval(), it probably would be possible to do a straight PHP->compiled binary translation with (relative) ease. But since PHP can itself dynamically build/execute scripts on the fly via eval(), it's not possible to do a full-on binary. Any binary would necessarily have to contain the entirety of PHP because the compiler would have no idea what your dynamic code could do. You'd go from a small 1 or 2k script into a massive multi-megabyte binary.
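A tiny sketch of the problem eval() creates for ahead-of-time compilation: the source the engine must run does not even exist until runtime.

```php
<?php
// The "program" here is assembled as a string at runtime, so a
// static PHP-to-binary compiler cannot know in advance what code
// (or which parts of PHP) it will need.
$op = '+';                    // imagine this arrived from user input
$code = "return 2 {$op} 3;";  // source built on the fly
$result = eval($code);
echo $result, "\n"; // 5
```

Changing $op to '*' at runtime yields a different program entirely, which is why any binary would have to ship the whole interpreter.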
