I am running an XML-RPC server to expose my site's API methods.
I want to log all incoming requests to find out which sites are actually calling my XML-RPC server.
Is there any way to do this?
All I've tried is looking into the $_SERVER and $request variables - there's nothing useful in them.
Thanks.
CI v2.1.3 with the in-box xmlrpcs library.
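For what it's worth, a minimal sketch of one way to capture callers before the xmlrpcs library handles the request: log the remote IP and the raw POST body in the controller method that serves the endpoint. The log path is illustrative, and note the php://input caveat in the comments.

    <?php
    // Minimal logging sketch for an XML-RPC endpoint. The request body
    // arrives as raw POST XML, so $_POST stays empty; read php://input
    // instead. Caveat: on PHP < 5.6, php://input can only be read once,
    // so if the xmlrpcs library also reads it you may need to fall back
    // to $HTTP_RAW_POST_DATA here instead.
    $raw = file_get_contents('php://input');

    $entry = sprintf(
        "[%s] ip=%s agent=%s\n%s\n",
        date('c'),
        $_SERVER['REMOTE_ADDR'],
        isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '-',
        $raw
    );

    // Illustrative log path; make sure the web server can write to it.
    file_put_contents('/var/log/xmlrpc_requests.log', $entry, FILE_APPEND);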
I have a PHP website using the Symfony2 framework. I wanted to know the best method to trace incoming requests on a production server in order to troubleshoot client issues. The only way I know of is the PHP log file in the C:\Windows\Temp directory. Is there any other way of tracing requests and troubleshooting errors related to them? All requests to the website are HTTPS, so I'm not sure Fiddler can help me in this scenario. Please let me know your feedback.
Thanks
There is more than one way to log client requests.
First, Apache logs incoming requests in its access log, e.g. /var/log/www/access.log (the exact path depends on your configuration).
Secondly, Symfony has its own logging, enabled by default and available in app/logs/prod.log.
Thirdly, you can implement client-side logging combined with Monolog, for example: JavaScript errors => send an AJAX request to a log endpoint => log them with a Monolog action (a sketch follows below).
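A minimal sketch of that third option, assuming Monolog is installed via Composer; the channel name, log path, and payload fields are illustrative:

    <?php
    require __DIR__ . '/vendor/autoload.php';

    use Monolog\Logger;
    use Monolog\Handler\StreamHandler;

    // Illustrative channel name and log path.
    $log = new Logger('client');
    $log->pushHandler(new StreamHandler(__DIR__ . '/app/logs/client.log', Logger::ERROR));

    // Endpoint body: receive a JavaScript error posted via AJAX and log it.
    $payload = json_decode(file_get_contents('php://input'), true);
    if (is_array($payload)) {
        $log->error('JS error', array(
            'message' => isset($payload['message']) ? $payload['message'] : '',
            'url'     => isset($payload['url']) ? $payload['url'] : '',
            'ip'      => $_SERVER['REMOTE_ADDR'],
        ));
    }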
Now, what type of OS will your site run on? You mention the Windows Temp directory; will your site be hosted on Windows or UNIX servers? The configurations and available tools will differ accordingly.
I am currently working on a project where I plan to do this via Graylog2 (as @Christophe suggested in his third solution).
I ran some local trials and it seems more than capable of logging just about anything.
Good day!
I have a PHP-based web application that I am looking to add WordPress to.
The main application is in the root folder, and WordPress is installed in /wp.
In order to get WP content into my application, I am using a JSON API (http://wordpress.org/plugins/json-api/)
Then, from the application, I call the API with cURL.
Is this a good way to go about calling the API? cURL seems to be very slow, and I think it has something to do with sessions and the fact that it is requesting a URL on the same domain.
Or perhaps someone could offer a suggestion on a better way to go about getting wordpress content into a non-wordpress based application.
Thanks for the help!
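On the "something to do with sessions" hunch: with PHP's default file-based sessions, a cURL self-request to the same domain blocks until the parent script releases the session lock, which looks exactly like a very slow cURL call. A minimal sketch of the usual workaround, with an illustrative JSON API route:

    <?php
    // Release the session lock before making a request back to our own
    // domain; otherwise the sub-request waits for this script's session.
    session_write_close();

    // Illustrative JSON API plugin route.
    $ch = curl_init('http://example.com/wp/api/get_recent_posts/');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);
    $json = curl_exec($ch);
    curl_close($ch);

    $posts = ($json !== false) ? json_decode($json, true) : null;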
I am running a server which hosts the main website, e.g. http://www.mywebsite.com, and another server which holds all the APIs, let's say http://api.mywebsite.com. The two are built with different technologies.
What I currently do is make cURL calls from www.mywebsite.com to access data from the APIs on api.mywebsite.com, but this weighs heavily on page response times on www.mywebsite.com.
So I am looking for an alternative: a library or something similar that can make the same calls with less resource consumption.
PS: I make GET/POST/PUT/DELETE requests to the server, so I can't use something that only supports GET.
Why don't you access the data directly from the database?
Have you tried caching with MemCache or Redis?
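A minimal sketch of that caching approach, assuming the phpredis extension; the key, endpoint, and TTL are illustrative:

    <?php
    $redis = new Redis();
    $redis->connect('127.0.0.1', 6379);

    $key  = 'api:users';                 // illustrative cache key
    $data = $redis->get($key);           // false on a cache miss

    if ($data === false) {
        // Cache miss: hit the API once, then serve from Redis afterwards.
        $ch = curl_init('http://api.mywebsite.com/users'); // illustrative endpoint
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        $data = curl_exec($ch);
        curl_close($ch);

        $redis->setex($key, 300, $data); // cache for five minutes
    }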
I've been developing a site for a client that uses the Tweetr API to post messages to a user's Twitter account. It has now been discovered that the destination server is running IIS, so there is no mod_rewrite for the Tweetr proxy. cURL is also unavailable, which prevents the proxy from forwarding requests.
The lack of mod_rewrite can be worked around, but I need an alternative to cURL that doesn't require anything beyond a very basic PHP 5.2 install. I'm considering recreating the proxy functionality with jQuery AJAX, but that seems like a very lengthy solution. Can anyone point me in the right direction?
I think that this might be what you're looking for.
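For reference, plain PHP streams can stand in for cURL on a stock PHP 5.2 install, provided allow_url_fopen is enabled. A minimal sketch of a POST, with an illustrative endpoint:

    <?php
    // Build a form-encoded body, exactly as a browser form submit would.
    $body = http_build_query(array('status' => 'Hello from the proxy'));

    $context = stream_context_create(array(
        'http' => array(
            'method'  => 'POST',
            'header'  => "Content-Type: application/x-www-form-urlencoded\r\n",
            'content' => $body,
        ),
    ));

    // Illustrative endpoint; returns false on failure.
    $response = file_get_contents('http://api.example.com/statuses/update.xml', false, $context);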
I have already heard about the cURL library, and I am getting interested in it.
I've read that there are many uses for it; can you provide me with some?
Are there any security problems with it?
One of the many useful features of cURL is interacting with web pages: you can send and receive HTTP requests and manipulate the data. That means you can log in to websites and actually send commands as if you were interacting from your web browser.
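A minimal sketch of that log-in-and-act pattern, assuming a plain form-based login; the URLs and field names are illustrative:

    <?php
    // Log in via POST, keep the session cookie, then request a page as
    // the logged-in user.
    $cookieJar = tempnam(sys_get_temp_dir(), 'cookies');

    $ch = curl_init('http://example.com/login');
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
        'username' => 'alice',           // illustrative credentials
        'password' => 'secret',
    )));
    curl_setopt($ch, CURLOPT_COOKIEJAR, $cookieJar);   // store cookies here
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_exec($ch);

    // Reuse the same handle (and cookies) for an authenticated request.
    curl_setopt($ch, CURLOPT_URL, 'http://example.com/dashboard');
    curl_setopt($ch, CURLOPT_POST, false);
    curl_setopt($ch, CURLOPT_COOKIEFILE, $cookieJar);  // send stored cookies
    $dashboard = curl_exec($ch);
    curl_close($ch);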
I found a very good web page titled "10 awesome things to do with cURL"; it's at http://www.catswhocode.com/blog/10-awesome-things-to-do-with-curl
One of its big use cases is automating activities such as having your application fetch content from other websites. It can also be used to post data to another website and to download files via FTP or HTTP. In other words, it allows your application or script to act as a user accessing a website, just as they would when browsing manually.
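A minimal sketch of the FTP-download use case; the host, credentials, and file name are illustrative:

    <?php
    $ch = curl_init('ftp://ftp.example.com/pub/report.csv');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_USERPWD, 'user:password'); // illustrative credentials
    $file = curl_exec($ch);
    curl_close($ch);

    if ($file !== false) {
        file_put_contents('report.csv', $file);
    }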
There are no inherent security problems with it, but it should be used appropriately, e.g. use HTTPS where required.
cURL Features
It's for spamming comment forms. ;)
cURL is great for working with APIs, especially when you need to POST data. I've heard that it's quicker to use file_get_contents() for basic GET requests (e.g. grabbing an RSS feed that doesn't require authentication), but I haven't tried it myself.
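For comparison, the file_get_contents() version of a basic unauthenticated GET, assuming allow_url_fopen is on; the feed URL is illustrative:

    <?php
    // One line for a simple fetch; no cURL required.
    $rss = file_get_contents('http://example.com/feed.rss');
    if ($rss !== false) {
        $xml = simplexml_load_string($rss);
    }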
If you're using it in a publicly distributed script, such as a WordPress plugin, be sure to check for it with function_exists('curl_init'), as some hosts don't install the extension...
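A minimal sketch of that availability check, with a hypothetical warning message as the fallback:

    <?php
    if (!function_exists('curl_init')) {
        // Host doesn't have the cURL extension: fail loudly rather than fatally.
        trigger_error('This plugin requires the PHP cURL extension.', E_USER_WARNING);
        return;
    }

    $ch = curl_init('https://api.example.com/data'); // illustrative endpoint
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $response = curl_exec($ch);
    curl_close($ch);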
In addition to the uses suggested in the other answers, I find it quite useful for testing web-service calls, especially on *nix servers where I can't install other tools and want to test the connection to a third-party web service (ensuring network connectivity, firewall rules, etc.) before installing the actual application that will communicate with it. That way, if there are problems, the usual response of 'something must be wrong with your application' can be avoided, and I can focus on diagnosing the network or other issues that are preventing the connection from being made.
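A minimal sketch of that kind of throwaway pre-install check in PHP, with an illustrative service URL:

    <?php
    // Verify connectivity / firewall rules to a third-party web service
    // before the real application is installed.
    $ch = curl_init('https://webservice.example.com/ping');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);

    $body = curl_exec($ch);
    if ($body === false) {
        echo 'Connection failed: ' . curl_error($ch) . "\n";
    } else {
        echo 'HTTP ' . curl_getinfo($ch, CURLINFO_HTTP_CODE) . "\n";
    }
    curl_close($ch);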
It can certainly simplify the small programs you need to write that communicate over higher-level protocols.
I do recall a contractor, however, attempting to use it inside a high-load Apache web server module, and it was simply too heavyweight for that particular application.