When building some of my PHP apps, a lot of the functionality could be coded using PEAR/PECL modules. However, some of the people using the apps may not have access to install things on their server, and that poses a puzzler for me.
Should I forsake some users and use PEAR/PECL for the functionality? These modules would let me get a system coded up quicker than writing my own functionality, but it means the apps will exclude certain people from using them.
It partly depends on how much time you have, and the purpose of the project. If you're just trying to make something that works, go with PEAR/PECL. If you're trying to learn to be a better programmer, and you have the time, then I'd recommend taking the effort to write your own versions. Once you understand the innards of whatever you're trying to replace, you may want to switch to the PEAR/PECL version so that you're not wasting time reimplementing what has already been implemented...
...but on the other hand, preexisting tools don't always do exactly what you need, and sometimes have overhead that doesn't do you any good. This is why Unix command-line tools are so small and narrow of purpose; nobody really needs a version of 'ls' that can do anything besides what 'ls' can currently do. Your version of whatever PEAR library will, by virtue of being written by you, do exactly what you need doing. It requires some careful thought...
...but on the gripping hand, don't spend too much time thinking about it. Spend five minutes, make a decision, and start coding. Even if you make the wrong decision, you'll at least have gotten more practice coding. :-)
Save on development time by developing with the PEAR libraries, and bundle the libraries in what you distribute (though you'll have to make sure you obey their licensing requirements).
I would not depend on particular PECL extensions being installed unless you're building something specifically tied to one (say, an XDebug web front-end); the majority of installs will be carrying a fairly vanilla set of extensions.
My suggestion is to start with assuming PEAR/PECL modules, and get the rest of the code done. Then, once you've got most of your code working the way you want, you can evaluate going back and piece by piece replacing the outside code with your own. Plus, by then you'll have a better idea of the impact using those has on your userbase.
Code it initially using PEAR/PECL. If you get people asking for a non-PEAR/PECL version, start coding your own alternatives for that version then.
The initial development will go much faster this way, and you may find that no one cares about requiring third-party libraries once you have started releasing apps.
Use PEAR, but allow for including the PEAR packages inside your project. All PEAR packages can be downloaded separately from http://pear.php.net/ and put anywhere. Depending on convenience and licensing issues, you could then package all the required PEAR files with your project or tell users how to download and "install" them.
What I do most times is never use PEAR installed globally on a server; versions can change and affect your application. Instead I have a config file (in my case XML) that lists all the required packages and their versions. The installer connects to my personal FTP repository, downloads all the PEAR packages and installs them locally in $PROJECTBASE/lib/pear/, and PEAR is run locally instead of globally. Something you may want to consider.
Using PEAR is no problem, if users do not have root access to their webserver, they can simply download the PHP files from pear.php.net and add it to their include path. PECL's a little more tricky to work around, since there's often no way to install new modules without root access.
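For example, a bundled copy can be wired in with one include-path line at the top of the app. A minimal sketch, assuming the PEAR files were copied into a lib/pear/ directory inside the project (the directory name and the Mail package are just illustrations):

<?php
// Prepend the bundled PEAR directory so that require_once 'Mail.php' style
// includes resolve to the local copies instead of a global install.
set_include_path(dirname(__FILE__) . '/lib/pear' . PATH_SEPARATOR . get_include_path());

// From here on, bundled packages load as if PEAR were installed globally.
require_once 'Mail.php'; // assumes the PEAR Mail package was bundled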
You need to watch out, because a lot of modules in PEAR are really of pretty low quality.
Some are great, don't get me wrong, but don't assume that anything in PEAR, by virtue of being in PEAR, is at any given quality. That means you need to at least skim the source of a PEAR module before deciding to use it, which for simple enough tasks may take more time than going without PEAR.
PECL is different, however. Extensions tend to be better vetted and tested, or else they'd crash PHP.
Reiterating much of what's already been said: http://www.codinghorror.com/blog/archives/001145.html
Related
I've been searching the internet for over 3 days now and cannot find anything with a clear explanation of how to install ffmpeg for PHP. I currently have ffmpeg installed correctly on my Mac command line, but how to use it in my PHP scripts is beyond me. How do I install ffmpeg for PHP? Any help would be great. Thanks.
ffmpeg is a set of executable programs, not an API. There was a project I was aware of in the past that built a PHP extension, but it was not robust and never really emerged as viable, for a number of reasons, not the least of which is that ffmpeg has a lot of different options and builds.
Several years ago I was tasked with building an audio and video encoding system for a social network startup, utilizing PHP as the middleware, so I've been through this exact exercise.
One of the most challenging aspects was coming up with a working compiled version of ffmpeg with all the encoders we wanted to have. In our case the hosted environment was AWS and we were using Amazon Linux servers, so there were a few hiccups along the way and patches I had to chase down. There were packages available that had ffmpeg, but they were hopelessly outdated and missing key features we needed. The only way to get things working was to get the ffmpeg source and compile it, along with the source for the various codecs we were using, primarily to get MPEG-4 video and compatible audio. If you aren't comfortable doing this, you will probably not be able to get things working.
In regards to the PHP side of it, I ended up using the PHP-FFMpeg library suggested in the comment above, but I did fork it and made a lot of customizations that worked for us, but were not really contributable back upstream. Subsequently, the maintainer of the library has addressed many of the issues I had and it is a much more robust library now that should save you a lot of problems if you were to try and create your own wrapper.
In summary your server needs:
A working compiled version of ffmpeg and its associated helper programs, which you may need depending on what you're going to be doing with the media you produce. For example, there is a separate media introspection program (ffprobe) that is used to determine the characteristics of media you want to encode.
A PSR-0/Composer compatible project. Ours was built on top of Symfony 2.x, but that isn't a requirement. I mention it because the project has really pushed the improvement and stability of the Symfony component that wraps PHP's 'exec' function, which is at the heart of any effort to call an external program.
Following the instructions and reading through the API, you should be able to get a sample encoding to work with PHP. Keep in mind that ffmpeg works with files, and there are lots of file-related issues to think through (original files, rendered files and naming, temporary file locations), all of which you'll have to deal with unless you're doing something trivial. In our case these programs were async, command-line/batch oriented, and a lot of time and effort had to go into figuring out how to scale and stay performant. Needless to say, encoding video can take a lot of time, and it is not something you want to do in a monolithic PHP script where the end user uploads and then waits while you do all the processing in the same script!
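As a rough illustration of the PHP side, here is a minimal PHP-FFMpeg sketch; the binary paths and file names are assumptions for your environment, so check them against the library's current documentation:

<?php
require 'vendor/autoload.php'; // PHP-FFMpeg installed via Composer

// Point the wrapper at your compiled binaries (these paths are assumptions).
$ffmpeg = FFMpeg\FFMpeg::create(array(
    'ffmpeg.binaries'  => '/usr/local/bin/ffmpeg',
    'ffprobe.binaries' => '/usr/local/bin/ffprobe',
));

// Open a source file and render it as H.264/MP4.
$video = $ffmpeg->open('input.mov');
$video->save(new FFMpeg\Format\Video\X264(), 'output.mp4');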
I know you are trying to do this on your Mac. Is that really the target environment for your production deployment? This is a finicky and platform-dependent enough process that I don't think it's advisable to hack together a version on your Mac, because getting ffmpeg with the exact version and components you need is highly variable and extremely important to your success.
I have a LAMP (Linux/Apache/MySQL/Php) application that I should release soon.
Even though I've never used autotools, I'm thinking about using them for it, to make the configuration and installation process easier (for the customer and for me, in the future).
Have you ever done (or thought of doing) such a thing? Are there any drawbacks? Does it make sense?
Autotools is mostly used when you are trying to compile your programs for multiple target platforms. It applies to C code in general and checks things like available libs, sizes of data types, libc functions, etc. So unless your program is written in C and you need to support all kinds of Unix flavors, don't bother with autotools.
If you are trying to build some kind of installation program for Linux, I suggest you look into rpmbuild (for Red Hat distros). Rpmbuild is easy to use if all you are doing is packaging files for easier distribution. A good tutorial is available here. One great aspect of rpmbuild is that you can specify requirements on the target system, for example: Apache, MySQL and even the specific PHP modules that you need.
For configuration and deployment, you can have a look at Ant.
In my previous employment, we were using Ant for deployment/configuration of a mix of Perl, PHP, XML, XSL, unit tests, Apache config...
You have a build.properties file where you can put some default values, and the customer just has to create a local.properties file whose values will overwrite the ones from build.properties.
Also, if you need to launch some scripts that are part of the setup, you can do that with Ant as well.
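A minimal sketch of that pattern in a build.xml (the names are illustrative): Ant properties are immutable, so whichever file is loaded first wins, and the optional local.properties is therefore read before build.properties.

<project name="myapp" default="deploy">
    <!-- The first definition of a property wins, so local overrides defaults. -->
    <property file="local.properties"/>
    <property file="build.properties"/>

    <target name="deploy">
        <echo message="Deploying to ${deploy.dir}"/>
    </target>
</project>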
Simple idea
I may be stating the obvious, but wouldn't it be easier just to use
phpinfo();
?
From its output you can read almost everything: server version, PHP version, MySQL version and the loaded PHP extensions. Compare that against what you need, and advise your client or their host that "I need this and that installed".
I am developing a PHP application which my customers will download and install on their own servers. I know the base requirements for my application (like the minimum PHP version), but is there a way to generate a list of the requirements needed to run my application on Windows or Unix systems?
Thanks.
You mean, generate a list of requirements based on an analysis of your source code?
While in theory that might be possible, I don't think such a solution exists. I think there is no way other than analyzing your code by hand, with the PHP manual very close by.
Do you use GD? Then you need PHP with the GD module. Do you need to create GIF images with GD? Then you need GD, but not between versions 1.6 and (I think) 1.8. Do you use PDO? Then you need PHP > 5.1.0. And so on and so on.
In short, I'm afraid this is going to be a manual process. Manual also as in "PHP manual": the User Contributed Notes to each function and method are a gem, and any common cross-platform problems are usually noted there somewhere.
While you can trust that PHP x.y.z has a defined set of functions and behaviour, be sure to test well before you declare something suitable to run on a different server. IIS's support of PHP is way better now, I'm told, but the last time I ported a big PHP application over to IIS, it took me three days to work around all the mysterious bugs.
Just be aware of what you are using. For example, you should clearly communicate it if you need something like a special database binding (other than MySQL), XML libraries, etc. Or, even better, create an installer that is bundled with your software and checks that kind of stuff.
Other than that, there should be no problems concerning different servers (Apache / IIS / FastCGI...). So to answer your question: you have to generate that list all by yourself.
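As a sketch of that bundled-installer idea, a pre-flight check could look something like this; the specific requirements tested here (PHP >= 5.1.0, GD and PDO) are assumptions for illustration:

<?php
// Collect every unmet requirement instead of failing on the first one.
$errors = array();

if (version_compare(PHP_VERSION, '5.1.0', '<')) {
    $errors[] = 'PHP 5.1.0 or newer is required, found ' . PHP_VERSION;
}

foreach (array('gd', 'pdo') as $ext) {
    if (!extension_loaded($ext)) {
        $errors[] = "Required PHP extension missing: $ext";
    }
}

if ($errors) {
    echo "Cannot install:\n" . implode("\n", $errors) . "\n";
    exit(1);
}
echo "All requirements satisfied.\n";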
As others have said, you'll need to manually keep track of special libraries and functions you're using. If you need PHP4 compatibility then you won't be able to use the built-in XML libraries for example. You can also check the list of functions added to PHP 5.
One thing I would recommend is installing WampServer if you have access to a Windows machine. Aside from being good for local development, you can download modules for most Apache/PHP/MySQL versions and test combinations.
I have absolutely no idea about version control. Only that it can be very useful in many ways.
I have found a few related questions but none that start from the absolute beginning.
I am the only developer at my work, I'm using Mac OS X, and traditionally I've just been using FTP.
Can anyone help me with version control in relation to PHP projects (does it matter)?
Yes, try it out, it's worth it. And the language you are using doesn't matter. It's working great with PHP for me and it will for you too.
Benefits
If you are the only developer, it is indeed easier to go without version control. However, you will find great benefits to using a version control system. Some of the easiest benefits will be:
Never wondering what your latest version is once you go back to a project (no more myproject090201-archive2-final6.zip)
Never fearing to start some major refactoring: if you make a mistake in a file, you'll just roll back to the latest version
If something stops working in your project and you have the feeling it worked at one point, you can easily test some of the prior versions and look at the difference between the working version and the non-working version to find what broke the code
An additional backup of your current project, even better if it's not on your machine... and of course, additional points for backing up your version control system itself; we're never too cautious. You don't want to have to restart that month-long project, do you?
Choices
As some have said, you have a few choices for your version control system, and I guess you'll want a free one to begin with. There are a few excellent commercial products, but the free ones have nothing to be ashamed of. So here are some very popular free version control systems:
Subversion (also called SVN)
Git
Mercurial
Bazaar
Centralized versus distributed
Subversion has been around for a while and is classified as 'centralized', meaning everyone always fetches the latest version from, and commits their latest work to, one central repository, often on another system although it can easily be on your own machine. It's a process that is easy to understand.
The three others are called 'distributed'. There are a lot of different possible processes, as it's a more flexible system, and that's why those three newcomers are getting a lot of traction these days in open-source projects where many people interact with one another. Basically, you work with your own revisions on your own machine, making as many copies as you need and deciding which versions you share with other people on other computers.
The trend definitely seems to go towards distributed systems, but as those systems are more recent, they are still missing the GUI tools that make them really user-friendly, and you might sometimes find the documentation to be a bit lighter. On the other hand, this all seems to be getting corrected quickly.
In your case, as you are working alone, it probably won't make a big difference, and although you'll hear very good points for centralized and distributed systems, you'll be able to work with one or the other without any problems.
Tools
If you absolutely need a GUI tool on your Mac, I'd choose SVN to get initiated into source control. There are two very good (commercial) products for that:
Versions
Cornerstone
And a few other ones (such as the free svnX) that are becoming a little bit old and unfriendly in my opinion, but that might be worth trying anyway.
If you don't mind not using the GUI tools, with the help of Terminal you'll be able to do all the same things with a few simple command lines with any of the aforementioned systems.
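For instance, the day-to-day Subversion workflow boils down to a handful of commands (the repository URL below is a placeholder):

svn checkout http://svn.example.com/repos/myproject myproject
cd myproject
svn add newfile.php
svn commit -m "Add newfile.php"
svn update
svn log
svn revert newfile.php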
Starting points
In any case, you'll want some starting points.
For Subversion, your first stop must be their free book, Version Control with Subversion. Take a few hours of your day to go through the chapters; it'll be time well invested. The introduction chapters are a good read even if you don't want to use Subversion specifically, because they'll get you to understand version control a little bit better.
For a distributed system, I've had fun with Mercurial, but it's an easily flammable subject so I'll let you make your own choice there. If you end up looking at Mercurial, have a look at this blog post; it was an excellent starter for me and will get you up and running with the basics in a few minutes if you're already a bit accustomed to version control in general. Anyway, drop by Mercurial's homepage and have a look at the Getting Started section of the page.
Conclusion
Give it a go; invest a day trying it out with a few bogus files. Try out renaming files and directories, erasing, moving things around, committing binary files versus text files, resolving conflicts and reverting to older versions, to get the hang of it. These are often the first few hurdles you'll encounter when playing with version control, and it'll be painless if it's on a non-production project.
In any case, it's something well worth learning that'll be helpful with your solo projects, as well as if you end up working with other developers at your current job or your next one.
Good luck!
The type of code is irrelevant.
One popular open-source version control system is Subversion, and there is a very good overview/manual here.
I use Git for PHP development.
It's fast, flexible, reliable, clean (CVS and SVN create a lot of hidden folders that I personally don't like).
Its distributed nature allows you to work the way you want (with or without a central repository).
You can find more about it here:
Gitmagic
Speed Benchmarks
Moreover, you can get Eclipse PDT (the PHP plugin) and use Subclipse in the IDE.
Versions is working well for another developer I work with. Additionally, if you are using Textmate the SVN bundle provides pretty much all you need for most parts of the Subversion workflow. I really like it.
The Project Plus plugin takes it a step further by adding small unobtrusive badges to versioned files in the project tree, showing at a glance the state of files in a project.
If you're on a Mac, do yourself a favor and pick up Versions, a beautifully designed (and highly functional) Subversion GUI. You'd do best to learn the terminology and get an idea of how Subversion works using a GUI before you jump to the command line. Once you're able to commit revisions of your code and run updates to get other people's work, then go back and read the red bean book (it really is the best way to learn Subversion in-and-out).
http://versionsapp.com/
Use Bazaar (http://bazaar-vcs.org/). It's very nice and you can start using it in minutes.
Check out other options too: Microsoft's TFS (used not only for source control but also for defect tracking, project management, etc.), Bazaar and Git are popular ones.
Alex,
Version control (and some will scathe me for this statement) is not a trivial matter, and even very experienced developers get themselves into trouble. The most frequent causes of frustration are limitations of a particular product (Visual SourceSafe is a famous one) and members of a team not following the same process, or not understanding the process at all.
This should not stop you from looking into using a source control tool - the opposite is the case. You can only use a tool effectively if you understand what it does and why.
I would recommend that you look into CVS. It has been around for many years, it is relatively simple to install, set up, and use, and while there are GUI clients available for most platforms, learning it from the command line may provide the best access to its features.
What can I do to increase the performance/speed of my PHP scripts without installing software on my servers?
Profile. Profile. Profile. I'm not sure if there is anything out there for PHP, but it should be simple to write a little tool to insert profiling information into your code. You will want to profile function times and SQL query times.
So where you have a function:
function foo($stuff) {
...
return ...;
}
I would change it to:
function foo($stuff) {
trace_push_fn('foo');
...
trace_pop_fn('foo');
return ...;
}
(This is one of those cases where multiple returns in a function become a hindrance.)
And SQL:
function bar($stuff) {
trace_push_fn('bar');
$query = ...;
trace_push_sql($query);
mysql_query($query);
trace_pop_sql($query);
trace_pop_fn('bar');
return ...;
}
In the end, you can generate a full trace of the program execution and use all sorts of techniques to identify your bottlenecks.
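For what it's worth, here is a minimal sketch of those trace helpers, assuming a global in-memory log whose timestamps are paired up after the request; a real tool would also handle nesting and persistence.

<?php
// Global in-memory trace log: each entry is (event, label, timestamp).
$GLOBALS['trace'] = array();

function trace_push_fn($name) {
    $GLOBALS['trace'][] = array('fn-enter', $name, microtime(true));
}

function trace_pop_fn($name) {
    $GLOBALS['trace'][] = array('fn-leave', $name, microtime(true));
}

function trace_push_sql($query) {
    $GLOBALS['trace'][] = array('sql-start', $query, microtime(true));
}

function trace_pop_sql($query) {
    $GLOBALS['trace'][] = array('sql-end', $query, microtime(true));
}

// After the request, pair the enter/leave entries to compute per-function
// and per-query durations.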
One reasonable technique that can easily be pulled off the shelf is caching. A vast amount of time tends to go into generating resources for clients that are common between requests (and even across clients); eliminating this runtime work can lead to dramatic speed increases. You can dump the generated resource (or resource fragment) into a file outside the web tree, and then read it back in when needed. Obviously, some profiling will be needed to ensure this is actually faster than regeneration - forcing the web server back to disk regularly can be detrimental, so the resource really does need to have heavy reuse.
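A minimal sketch of that file-based approach; the cache directory and TTL are assumptions, and the hypothetical generate_fragment() stands in for the expensive work:

<?php
// Return a cached fragment if it is still fresh, otherwise regenerate it.
function cached_fragment($key, $ttl) {
    $file = '/var/cache/myapp/' . md5($key); // a directory outside the web tree

    if (file_exists($file) && (time() - filemtime($file)) < $ttl) {
        return file_get_contents($file);     // cache hit: skip regeneration
    }

    $content = generate_fragment($key);      // hypothetical expensive work
    file_put_contents($file, $content);
    return $content;
}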
You might also be surprised how much time is spent inside badly written database queries; time your commonly generated queries and see if they can be rewritten. The amount of time spent executing actual PHP code is generally pretty limited, unless you're using some sub-optimal algorithms.
Neither of these is limited to PHP, though some of PHP's "magic" approaches/functions can over-protect you from thinking about these concerns. For example, I recently updated a script that was using array_search to use a binary search over a sorted array, and got the expected speedup: O(log n) lookups instead of a linear scan.
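Here is a sketch of that swap, assuming a sorted, numerically indexed array; array_search scans linearly, while this halves the search range on each step and mirrors array_search by returning false on a miss.

<?php
function binary_search(array $sorted, $needle) {
    $lo = 0;
    $hi = count($sorted) - 1;
    while ($lo <= $hi) {
        $mid = ($lo + $hi) >> 1;              // midpoint of the current range
        if ($sorted[$mid] === $needle) {
            return $mid;                      // found: return the index
        } elseif ($sorted[$mid] < $needle) {
            $lo = $mid + 1;                   // search the upper half
        } else {
            $hi = $mid - 1;                   // search the lower half
        }
    }
    return false;                             // same failure value as array_search
}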
Really consider using the XDebug profiler: it helps you check how often a certain function is executed, compared against what you would have expected.
I try to decrease instructions while improving code readability by replacing logic with array lookups when appropriate (see the sketch at the end of this answer).
It's what Jeff Atwood wrote in "The Best Code is No Code At All": http://www.codinghorror.com/blog/archives/000878.html
Also, avoid loops inside another loop, and nested if/else statements.
Short functions. Sometimes a lot of code does not need to be executed when the result value is already known.
Unnecessary testing:
if (count($array) === 0) return;
can also be written as:
if (! $array) return;
Another function-call eliminated!
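And here is a sketch of the array-lookup idea mentioned at the top; the status-code mapping is an invented example.

<?php
// A chain of if/elseif branches collapses into one indexed lookup.
function status_label($code) {
    static $labels = array(
        0 => 'pending',
        1 => 'active',
        2 => 'suspended',
        3 => 'closed',
    );
    return isset($labels[$code]) ? $labels[$code] : 'unknown';
}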
You can optimize the code with two basic things:
Optimizing the PHP-associated libraries and the server: go through https://www.digitalocean.com/community/articles/how-to-optimize-apache-web-server-performance, or use a profiling tool like XHProf to see which parts of your code can be optimized (here is a link to follow: http://michaelsanford.com/compiling-xhprof-for-php-5-4/).
Optimizing your code using a code profiler and a code analyzer: you need to install NetBeans in order to use the following plugin.
Here are the steps you need to follow:
1) Open NetBeans, then select Tools > Plugins from the menu bar. Search for the plugin named "phpcsmd" in the Available Plugins tab and install it from there.
2) Now open the terminal and become the superuser by typing the command "sudo su".
3) Install the PEAR library (if it is not already installed) on your system by running the following commands in your terminal:
a) wget http://pear.php.net/go-pear.phar
b) php go-pear.phar
We need this for the installation of the further add-ons.
4) Then run the command
"pear config-set auto_discover 1"
This sets channel auto-discovery to "true" for the required plugins, so they get installed to the desired location automatically.
5) Then run the command below to install PHP CodeSniffer:
"pear install --alldeps pear/PHP_CodeSniffer"
6) Now install PHP Mess Detector by running the following command:
"pear install --alldeps phpmd/PHP_PMD"
If you get an error like "invalid package name/package file "phpmd/PHP_PMD"" while installing this module, run "pear channel-discover pear.phpmd.org" to get rid of it. After that, you can run the above command again to install Mess Detector.
7) Now install PHP Depend by running the following command:
"pear install --alldeps pdepend/PHP_Depend"
8) Now install PHP Copy/Paste Detector by running the following command:
"pear install --alldeps phpunit/phpcpd"
9) Then run the command
"pear config-set auto_discover 0"
This sets channel auto-discovery back to "false".
10) Then open NetBeans and follow the path Tools > Options > PHP > PHPCSMD.
There is no magic solution, and attempting to provide generic solutions could well just be a waste of time.
Where are your bottlenecks? For example, are your scripts processor-, database- or memory-intensive?
Have you performed any profiling?
Including files is slow, and requiring them is even slower. If you use __autoload for including every class, for example, that will add up.
I'm always a bit wary of trying to be too clever in terms of code optimisation if it sacrifices code clarity. If you need to make code obscure to make it fast, would it not be cheaper to upgrade hardware instead of wasting your time trying to tweak code? Processor cycles are cheaper than programmer cycles, after all.
The ones I can think of...
Loop invariants are always a good one to watch (see the sketch after this list).
Write E_STRICT and E_NOTICE compliant code, particularly if you are logging errors.
Avoid the @ error-suppression operator.
Use absolute paths for requires and includes.
Use strpos, str_replace etc. instead of regular expressions whenever possible.
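To illustrate the loop-invariant point above (process() is a hypothetical stand-in):

<?php
// count($items) is re-evaluated on every pass here...
for ($i = 0; $i < count($items); $i++) {
    process($items[$i]);
}

// ...and only once here, with the invariant hoisted out of the loop.
$n = count($items);
for ($i = 0; $i < $n; $i++) {
    process($items[$i]);
}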
Then there's a bunch of other methods that might work but probably won't give you much benefit.
Whenever I look at performance problems, I think the best thing to do is time how long your pages take to run, and then look at the slowest ones. When you get these real metrics, you can often improve performance on the slowest ones by orders of magnitude, either by fixing a slow SQL query or perhaps tightening up the code a bit.
This of course requires no new hardware or special software, just a critical eye on the existing code.
That said, this will only work for so long... if you really are getting enough traffic to hit the limits of your hardware, and/or there is some code that is just inherently slow and really required, you will have to look at other possibilities.
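A minimal sketch of that page-timing idea, assuming a one-second threshold and the standard error log (both are arbitrary choices):

<?php
// At the very top of the page:
$page_start = microtime(true);

// At shutdown, log any request slower than the threshold.
register_shutdown_function(function () use ($page_start) {
    $elapsed = microtime(true) - $page_start;
    if ($elapsed > 1.0) {
        error_log(sprintf('%s took %.3fs', $_SERVER['REQUEST_URI'], $elapsed));
    }
});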
I'm responsible for a large reporting system and we track the slowest reports in a similar way. I fire a unique key into the db when the report starts, and when it finishes I can determine how long it took. I'm using the database because that way I can detect when pages time out (which happens a lot more often than I'd like).
Follow some of the other advice first like profiling and making good resource allocation decisions, e.g. caching.
Also, take into account the performance of outside resources like your database. In MySQL, you can check the slow query log, for example. In addition, make sure you didn't design your database and then forget about it: optimizing your queries (again, for MySQL) against real data can pay off big.
Rasmus Lerdorf gave some good tips in his recent presentation "Simple is Hard" at FrOSCon '08. If you are using a bytecode cache (and you really should be using one), include path misses hurt a lot, so optimize your require/require_once.
You can use a profiling tool like XHProf to see which parts of your code can be optimized!
1) Use the latest version of PHP
The core team is working hard on improving the performance of PHP in every version.
2) Use a bytecode cache
Since PHP 5.5, a bytecode cache named OPcache has been bundled with PHP. Using OPcache can make a huge difference, especially since PHP 7. It receives improvements in every PHP version and might even get a JIT implementation in the future.
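A typical starting point in php.ini might look like this; the values are illustrative defaults, not tuned recommendations:

; Enable OPcache and give it room for a mid-sized codebase.
opcache.enable=1
opcache.memory_consumption=128
opcache.max_accelerated_files=10000
; In production you can skip per-request timestamp checks,
; but then you must reset the cache on every deploy.
opcache.validate_timestamps=0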
3) Profiling
While developing, profiling gives you great insight into what exactly is happening. It helps a lot in finding bottlenecks in your code.
One of the most used tools is XHProf, but it is not officially supported anymore and has issues with PHP >= 7. An alternative for profiling PHP >= 7 is Tideways, which is a fork of XHProf.