I have two servers running PHP that need to communicate, calling each other's remote methods. XML-RPC for PHP is not a very good solution because it's very slow, and I can't call multiple methods in one request.
Try MultiRpc: it can call multiple methods in one request, over an encrypted and compressed protocol.
XML-RPC in PHP is actually pretty fast if you use the built-in xmlrpc extension. It supports multiple method calls in a single request as well, using system.multicall (though the native multicall handling is broken up to PHP 5.3.2). I've written an easy-to-use library for XML-RPC called Ripcord, which lets you create servers and clients very easily and works around most of the bugs in PHP's native xmlrpc functions. See http://ripcord.googlecode.com/
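For illustration, this is roughly what a system.multicall payload looks like when built with the native xmlrpc extension. The method names are made up, and the `xmlrpc_encode_request` call is shown commented out because ext/xmlrpc is not always available (it was unbundled as of PHP 8):

```php
<?php
// system.multicall wraps many calls into one request: the single
// parameter is an array of {methodName, params} structs.
$calls = [
    ['methodName' => 'user.get',    'params' => [42]],
    ['methodName' => 'user.rename', 'params' => [42, 'Alice']],
];

// With ext/xmlrpc this becomes one XML request body:
// $xml = xmlrpc_encode_request('system.multicall', [$calls]);
// ...POST $xml to the endpoint, then xmlrpc_decode() the single response,
// which contains one result (or fault struct) per call, in order.
```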
Thanks for Ripcord. I think it's a very good library. I just used it and got some RPC calls working in an hour or less.
I'm including a gettext library in my application, but our team has decided to go with PHP's native gettext. The library accepts a string and converts it using a "Translate" function defined in the library. How can I switch over to the native extension now? Is it only for performance? Any suggestions on using the native extension? Thanks in advance.
It is mainly for performance, as well as ease of use.
When you use an external library from PHP via system(), for example, the pro is that you can use ALL of its options, which makes you a power user. The con is that each time you run it, you have to parse the returned string and work out the results, which is a hassle and pretty error-prone.
When you use a language binding for the external library, the con is that you are confined to the API calls the binding provides. The pro is that the return values, error status, etc. are well defined within those API calls, so handling them is easier.
This is usually a tradeoff; whether to use native interfaces or just execute the library directly varies from case to case.
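To make the parsing hassle concrete, here is a trivial stand-in comparison (counting a file's bytes rather than translating strings; it assumes a Unix shell with `wc` available):

```php
<?php
$file = tempnam(sys_get_temp_dir(), 'demo');
file_put_contents($file, 'hello');

// External process: the result is an untyped string that we must
// trim and cast ourselves, and a failed command just yields null.
$raw = shell_exec('wc -c < ' . escapeshellarg($file));
$viaShell = $raw === null ? null : (int) trim($raw);

// Native call: typed return value with a defined error value (false).
$viaNative = filesize($file);

unlink($file);
```

Both give the same answer here, but only the shell version needed escaping, trimming, casting, and a null check.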
I'm developing a SOAP server using the PHP NuSOAP library. However, I don't need the dynamically generated WSDL feature that NuSOAP provides; I just want to tell NuSOAP to use a fixed WSDL file written manually by another team.
What do you suggest ?
My suggestion: give up. I'm not entirely sure you can do what you want.
As you know, NuSOAP creates the WSDL on the fly only from the functions that you specify and then returns the resultant WSDL when requested.
If you could use an external fixed WSDL, what would happen if it is changed later on without your knowledge? A call to a SOAP method which is not handled by one of your functions could provide unknown results and would need to be handled by the calling machine in a nice, non-customer impacting way. Conversely, if you provide a new functionality but the other team won't adjust the WSDL for you, what do you do? Try and shoe-horn it into some other function?
The effort of matching your functions to a pre-defined WSDL without errors would far outstrip any benefit you could get.
Stick with the "on-the-fly" generation for consistency and lack of headaches. Use the pre-generated WSDL as a reference, but don't bother investigating whether you can use it directly.
Also, I agree with #chrfin. If they are available on your server, consider using the native PHP SOAP functions: they are noticeably faster than NuSOAP, as they are compiled rather than interpreted. The only reason I used NuSOAP in the first place was that (about 5 years ago) native SOAP had problems communicating with a provider I needed (incorrect variable types, etc.). Now, though, I will be refactoring all of my code to native PHP SOAP.
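For reference, here is a minimal sketch of how the native extension consumes a fixed, hand-written WSDL. The file name and handler class are hypothetical, and the SoapServer lines are commented out because they require ext/soap plus an actual WSDL file:

```php
<?php
// The class whose public methods back the SOAP operations
// declared in the hand-written WSDL.
class MyServiceHandler
{
    public function ping(): string
    {
        return 'pong';
    }
}

// Native SOAP takes the fixed WSDL path directly; nothing is generated:
// $server = new SoapServer('service.wsdl', ['cache_wsdl' => WSDL_CACHE_NONE]);
// $server->setClass(MyServiceHandler::class);
// $server->handle(); // reads the request from php://input and replies
```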
For a PHP project, I have to access a RESTful API. I was using curl to get familiar with the API. I can access it using both PHP's cURL library and by invoking the curl utility via PHP's shell_exec() function. Performance-wise, which option would be better, and why?
PS: I have my own server with root privilege.
My cautious guess (backed by a not-too-useful test snippet) would be that the curl library is more performant.
Edit: a little test shows that the library is faster, but not by much. Also, if you fetch millions of URLs, network latency is more likely to be the bigger problem.
Performance is pretty much exactly the same, because the same code is being executed internally. But you should use the API because it is cleaner.
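A side-by-side sketch of the two approaches (the URL is a placeholder, and the actual network calls are commented out):

```php
<?php
$url = 'https://example.com/api';

// 1) ext/curl: everything stays in-process; errors come back as typed values.
// $ch = curl_init($url);
// curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// $body = curl_exec($ch);  // response body as string, or false on failure
// $err  = curl_errno($ch); // numeric libcurl error code
// curl_close($ch);

// 2) shell_exec: forks a whole process per request and returns one raw
// string; the exit code and stderr are lost without extra shell plumbing.
$cmd = 'curl -s ' . escapeshellarg($url);
// $body = shell_exec($cmd);
```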
It's evident that the cURL functions are very widely used. But why is that? Is it really only because the extension is mostly enabled by default?
While I can certainly relate to not introducing 3rd party libraries over builtins (DOMDocument vs phpQuery), using curl appears somewhat odd to me. There are heaps of HTTP libraries like Zend_Http or PEAR Http_Request. And despite my disdain for needless object-oriented interfaces, the pull-parameter-procedural API of curl strikes me as less legible in comparison.
There is of course a reason for that. But I'm wondering if most PHP developers realize what else libcurl can actually be used for, and that it's not just an HTTP library.
Do you have examples or actual code which utilizes cURL for <any other things> it was made for?
Or if you just use it for HTTP, what are the reasons. Why are real PHP HTTP libraries seemingly avoided nowadays?
I think this is related to why people use the mysql functions instead of mysqli (a more object-oriented interface), or go a step further and use a data abstraction layer or PDO.
HTTP_Request2 says that there is a cURL adapter available to wrap around PHP's cURL functions.
Personally, I haven't been that impressed with a lot of the PEAR extensions I've tried (and I feel less confident about PEAR libraries sitting in alpha that haven't been updated in a long time), whereas the HTTP_Request2 library does look quite nice.
I for one would have used cURL without thinking of looking at a possible PEAR library to use. So thanks for raising my awareness.
The libraries you mentioned aren't bundled by default, and from my experience in PHP, I prefer to use as few such libraries as possible: they widen the attack surface, decrease reliability, and are more open to future modification/deprecation than PHP itself.
Then there's the sockets functionality, which I've used a few times, but I prefer to rely on a higher-level approach whenever possible.
What have I used CURL for?
As some may know, I'm currently working on a PHP framework. The communication core extension (appropriately called "connect") uses cURL as its base.
I've used it widely, from extracting favicons from websites (together with parser utilities and such) to standard API calls over HTTP, as well as an FTP layer for when PHP's FTP support is disabled (through stream wrappers). And we all know native PHP FTP ain't that reliable.
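As a concrete sketch of the FTP-over-cURL case (host, credentials, and path are placeholders; this assumes ext/curl is loaded):

```php
<?php
// Fetching a file over FTP with ext/curl instead of ext/ftp.
$ch = curl_init('ftp://ftp.example.com/pub/readme.txt');
$configured = curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,            // return the file as a string
    CURLOPT_USERPWD        => 'user:password', // FTP credentials
]);
// $contents = curl_exec($ch); // would perform the actual FTP transfer
curl_close($ch);
```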
Functional reasons as mentioned in the comments:
It's very old, [widely used and] well tested code, works reliably
is usually enabled by default
allows very fine grained control over the details of the request.
This might need expanding: by the nature of its common-denominator protocol API, cURL might provide features that plain HTTP libraries in PHP can't.
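One quick way to see this: libcurl itself will report which protocols the local build supports (this assumes ext/curl is loaded):

```php
<?php
// curl_version() describes the libcurl build that PHP is linked against.
$info = curl_version();
$protocols = $info['protocols']; // e.g. http, https, ftp, ftps, smtp, imap, ...
```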
Historic reasons:
curl used to be the only thing that could handle cookies, POST, file uploads...
A lot of curl use probably comes from tutorials that pre-date PHP 5.
Which PHP RPC (XML or JSON) library have you successfully used?
I have done some research but haven't been able to find one library that stands out from the others.
I've found the following:
XML-RPC for PHP
XML_RPC (PEAR)
JSON-RPC PHP
And a few others that either don't look very active or mature.
I've used XML_RPC (PEAR) successfully. I'm not personally a fan of the XML-RPC "way", but the library was simple to use; we have a few dozen clients using it daily, pulling a fair amount of data over the wire, and we've never had any problems.
We aren't pushing the envelope with this at all, in any way, but I'm very happy with the library since I don't even have to think about it anymore. The library isn't elegant or anything, but neither is PHP or RPC, right?
I've written a simple XML-RPC library for PHP 5, called Ripcord. You can download it at http://ripcord.googlecode.com/. It is as easy to use as I could make it, even for the more advanced features like system.multicall. Give it a try.
We've built a very robust RPC library that follows the RPC specs 100%. You can switch between JSON-RPC 2.0 and JSONP on the fly, and it also supports batch requests, signed requests, and service method introspection based on Dojo's RPC proposals.
XApp-RPC core package https://github.com/XApp-Studio/xapp-Rpc
Example usage : xappcommander.com
We built it because nothing that was around met our license/quality requirements. Have fun.
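For context, a JSON-RPC 2.0 batch request is just an array of ordinary request objects sent in one HTTP round trip (the method names below are made up):

```php
<?php
// Two calls, one request body; the "id" fields match responses to calls.
$batch = [
    ['jsonrpc' => '2.0', 'method' => 'user.get',  'params' => [42], 'id' => 1],
    ['jsonrpc' => '2.0', 'method' => 'user.list', 'params' => [],   'id' => 2],
];
$payload = json_encode($batch);
```

The server replies with an array of response objects, one per "id", in any order.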