I have a website that is consuming its own API, which means it sends requests to itself.
I feel like there should be a faster way to request a local page than just using the full API URL.
Right now I'm doing: file_get_contents("http://domain.com/api/recent")
These didn't work when I tried them:
file_get_contents("http://localhost/api/recent")
file_get_contents("http://127.0.0.1/api/recent")
Sorry, I can't comment since I don't have enough reputation - my five cents:
You could easily use the local API by including/requiring the API PHP file, setting up all of the POST/GET variables before the inclusion. If the user sent you important data, you could stash those values first and restore them afterwards.
When you call the API over HTTP it is simply slower, because the request goes back out through the web server instead of staying inside the PHP engine. Cheers.
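A minimal sketch of that approach, assuming the API endpoint lives in a file like api/recent.php and with an invented parameter name for illustration:

    <?php
    // Instead of file_get_contents("http://domain.com/api/recent"),
    // set up the request variables and pull the API script in directly.

    $originalGet = $_GET;              // keep the caller's own request data

    $_GET = array('limit' => 10);      // whatever /api/recent expects (parameter name is a guess)

    ob_start();
    require __DIR__ . '/api/recent.php';   // path is an assumption; adjust to your routing
    $response = ob_get_clean();            // whatever the API script would have echoed

    $_GET = $originalGet;              // restore the original request data

    $recent = json_decode($response, true);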
I am using a script that fetches user info from an API.
This is how it works: I get the user's ID posted to PHP,
then make a cURL request to the API with that user ID to get their info.
The problem is that whenever the user refreshes the page, my script does the work again and fetches the info to display it. But this API takes a long time to respond, which slows my script down.
Is there any way to cache the API response for some time? (The server I request forces Cache-Control: no-cache in its headers.)
I am thinking about creating a directory with a JSON file (the API response) for every user ID, or I could go with a MySQL solution.
What's the best solution for me?
N.B.: I'm getting about 200,000 requests/day.
Thanks
It sounds like you actually need to optimize this API, so that updates will be there as you expect them to be.
However, if you insist that caching is appropriate for your use, there's no need to reinvent the wheel. Simply put Nginx, Varnish, etc. in front of the API server. Then configure your application to request from this caching proxy rather than the API server directly.
See also:
Nginx Caching Configuration Documentation
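For illustration, a minimal Nginx caching-proxy sketch; the cache path, listen port, upstream hostname, and five-minute lifetime are assumptions to adapt to your setup:

    # Cache responses from the slow upstream API.
    proxy_cache_path /var/cache/nginx/api keys_zone=api_cache:10m max_size=100m;

    server {
        listen 8080;

        location /api/ {
            proxy_pass           http://api.example.com;    # the upstream API (placeholder)
            proxy_cache          api_cache;
            proxy_cache_valid    200 5m;                     # keep successful responses for 5 minutes
            proxy_ignore_headers Cache-Control Expires;      # the upstream forces no-cache, so override it
            add_header           X-Cache-Status $upstream_cache_status;
        }
    }

Your PHP script would then send its cURL requests to http://localhost:8080/api/... instead of hitting the slow API directly.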
I'm making a simple web-based control panel, and I figured the easiest way to accomplish it would be to have PHP on the two machines (one being the web-facing machine, the other being behind a VPN). Basically, when I press a button on the site on the externally facing IP of machine 1, it needs to send a request to the internal IP (e.g. 192.168.100.1) of machine 2 and run the PHP file there (test.php plus some $_GET data) without actually redirecting the end user to 192.168.100.1, because that would obviously time out as the user has no access to it.
If all you want is to make certain internal PHP pages accessible on the external server, you should consider setting up a reverse proxy instead of manually proxying requests with PHP.
See the Apache documentation for an example: http://httpd.apache.org/docs/2.2/mod/mod_proxy.html
Of course this won't work if you do your authentication on the external server and/or need to execute additional PHP code on the external server before/after the internal PHP code. In that case refer to Mihai's or Louis's answer.
You can use cURL to send or forward HTTP requests from machine 1 to machine 2 and to receive the responses machine 2 gives you and (if needed) process those responses to show them to the user.
You could also use (XML-/JSON-)RPC or SOAP, which would be a bit more elegant and extensible (and more commonplace than raw cURL), but it comes with a steeper learning curve and more setup work.
You should also be able to use file_get_contents (which normally supports the http protocol) or http_get, a function designed for simple HTTP GET requests.
It might not be the most ideal way, but should be fairly easy to do.
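For the cURL route, here is a rough sketch of what machine 1 could do; the internal IP and script name come from the question, and the parameter name is a guess:

    <?php
    // On machine 1 (web-facing): forward the button press to machine 2 and
    // show machine 2's output without redirecting the user there.
    $action = isset($_GET['action']) ? $_GET['action'] : 'status';   // parameter name is a guess

    $ch = curl_init('http://192.168.100.1/test.php?action=' . urlencode($action));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);   // capture the response instead of printing it
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);

    $response = curl_exec($ch);
    if ($response === false) {
        $response = 'Request to the internal machine failed: ' . curl_error($ch);
    }
    curl_close($ch);

    echo $response;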
I'm writing an API in PHP to wrap a website's functionality, returning everything as JSON/XML. I've been using cURL, and so far it's working great.
The website has a standard file-upload POST form that accepts file(s) up to 1 GB.
So the problem is: how do I redirect the file-upload stream to the corresponding website?
I could download the file and upload it afterwards, but my server limits me to just 20 MB, and it seems like a poor solution anyway.
Is it even possible to control the stream and redirect it directly to the website?
I preserved my original answer at the bottom for posterity but, as it turns out, there is a way to do this.
What you need is a combination of the HTTP PUT method (which unfortunately isn't available in native browser forms), the PHP php://input wrapper, and a streaming PHP socket. This gets around several limitations - PHP disallows php://input for POST data, but it does nothing of the sort for PUT file data - clever!
If you're going to attempt this with Apache, you're going to need mod_actions installed and activated. You'll also need to specify a PUT script with the Script directive in your virtualhost/.htaccess.
http://blog.haraldkraft.de/2009/08/invalid-command-script-in-apache-configuration/
This allows PUT requests on only one URL endpoint. This is the file that will open the socket and forward its data elsewhere; in my example case below, this is just index.php.
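As a rough illustration only (not the code from the example linked below), such an index.php could read the raw PUT body and stream it straight to another host; the target address, port, and chunk size are placeholders:

    <?php
    // index.php - the single PUT endpoint that Apache's Script directive points at.
    if ($_SERVER['REQUEST_METHOD'] !== 'PUT') {
        header('Allow: PUT', true, 405);
        exit;
    }

    // Raw request body; unlike POST uploads, PUT data is not buffered into $_FILES.
    $input = fopen('php://input', 'rb');

    // Plain TCP socket to the receiving service (address and port are placeholders).
    $target = fsockopen('192.168.1.50', 9000, $errno, $errstr, 10);
    if (!$target) {
        header('HTTP/1.1 502 Bad Gateway');
        exit('Could not reach the upstream service: ' . $errstr);
    }

    // Stream the body through in small chunks so it never has to fit in memory.
    while (!feof($input)) {
        fwrite($target, fread($input, 8192));
    }

    fclose($input);
    fclose($target);
    echo 'forwarded';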
I've prepared a boilerplate example of this using the Python requests module as the client sending the PUT request with an image file. If you run remote_server.py it will open a service that just listens on a port and awaits the forwarded message from PHP. put.py sends the actual PUT request to PHP. You're going to need to set the hosts in put.py and index.php to the ones you define in your virtual host, depending on your setup.
Running put.py will open the included image file, send it to your php virtual host, which will, in turn, open a socket and stream the received data to the python pseudo-service and print it to stdout in the terminal. Streaming PHP forwarder!
There's nothing stopping you from using any remote service that listens on a TCP port in the same way, in another language entirely. The client could be rewritten the same way, so long as it can send a PUT request.
The complete example is here:
https://github.com/DeaconDesperado/php-streamer
I actually had a lot of fun with this problem. Please let me know how it works and we can patch it together.
Begin original answer
There is no native way in PHP to pass a file along asynchronously as it comes in with the request body without saving its state to disk in some manner. This means you are hard-bound by the memory limit on your server (20 MB). The way the $_FILES superglobal is initialized after the request is received depends on this, as it will attempt to migrate that multipart data to a tmp directory.
Something similar can be achieved with the use of sockets, as this will at least circumvent the HTTP protocol, but if the file is passed in the HTTP request, PHP is still going to attempt to hold it statefully in memory before it does anything at all with it. You'd have the tail end of the process set up with no practical way of getting that far.
The Stream library comes close, but it still relies on reading the file out of memory on the server side - it has to already be there.
What you are describing is a little bit outside of the HTTP protocol, especially since the request body is so large. HTTP is a request/response-based mechanism, and one depends upon the other... it's very difficult to accomplish an in-place, streaming upload at an intermediary point, since this would imply some protocol that uploads while the bits are streamed in.
One might argue this is more a limitation of HTTP than of PHP, and since PHP is designed expressly with the HTTP protocol in mind, you are moving outside its comfort zone.
Deployments like this are regularly attempted with high success using other scripting languages (Twisted in Python, for example; a lot of people are getting on board with Node.js for its concurrent design pattern; and there are alternatives in Ruby or Java that I know much less about).
I am currently working on a site. I designed it as an API so that it can easily interface with mobile devices as well as keep me completely separate from front-end development. The intention was that the front-end designer would use JavaScript/jQuery to make API calls. The API returns JSON, so the front-end designer would format the content appropriately.
I noticed that instead of using jQuery to obtain this data, he is using inline PHP to make the API calls with cURL to localhost, then echoing the JSON result and formatting it. Is this cause for concern, since the server is essentially requesting itself? A new process is spawned, the server has to process a request AND a response, etc. Is it better for the remote clients to resolve the API calls with jQuery, or for the server to cURL localhost and resolve them?
It sounds like the performance issue here would be that PHP might be getting loaded up more than it needs to:
JQuery -> RESTful API built on PHP
vs
JQUERY -> PHP cURL Call -> cURL -> RESTful API built on PHP
Each call takes an extra use of PHP, as you said, spawning another process. The extra use of cURL is no biggie (it's lightweight), but the extra use of PHP might be an issue if you are going to have heavy usage (say 100 concurrent requests, but it really depends on your server and many other factors).
I'm developing a CodeIgniter-based IPN handler script for my shopping app. It seems that the PayPal sandbox uses cached versions of my response script: I get an email with the POST values every time I send an IPN test, but I changed the email template about two hours ago and the IPN script still sends the emails with the old layout.
That makes debugging my IPN variables a pretty bad mess. I tried setting the Cache-Control header to "must-revalidate", but the results are the same.
It is as if PayPal stores a proxied version of my file and uses it over and over again.
Do you have any ideas about this issue?
If I had to bet, I would bet against this being a caching issue. PHP scripts usually don't emit any caching headers (but of course, do make sure to check e.g. using Firebug), and the purpose of the whole thing would be defeated if PayPal actually listened to such caching instructions.
I would triple- and quadruple-check the URL that PayPal calls to see whether there is a second version of the script hanging around that doesn't get updated - maybe a case of Index.php vs. index.php or something? That often is the reason.
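One quick way to rule that out is to drop a marker into every copy of the script that PayPal might be hitting and log each request; the log file name here is just an example:

    <?php
    // Put this at the very top of each candidate IPN script.
    file_put_contents(
        __DIR__ . '/ipn-debug.log',
        date('c') . ' hit ' . __FILE__ . ' with ' . json_encode($_POST) . PHP_EOL,
        FILE_APPEND
    );

Whichever file the log shows up next to is the one PayPal is actually calling.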
The only caching culprit I can think of is a reverse proxy on your web server's end. But you're not mentioning having one, so I'm assuming there is none.