I'm developing a CodeIgniter-based IPN handler script for my shopping app. It seems that the PayPal sandbox uses cached versions of my response script. I get an email with the POST values every time I send an IPN test. I changed the email template about two hours ago, but the IPN script still sends the emails with the old layout.
That makes debugging my IPN variables a pretty bad mess. I tried setting the Cache-Control header to "must-revalidate", but the results are the same.
It is just as if PayPal stored a proxied version of my file and used it over and over again.
Do you have any ideas about this issue?
If I had to bet, I would bet against this being a caching issue. PHP scripts usually don't emit any caching headers (but of course, do make sure to check, e.g. using Firebug), and the whole purpose of IPN would be defeated if PayPal actually honored such caching instructions.
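That said, it costs nothing to rule caching out explicitly. A minimal sketch of belt-and-braces no-cache headers at the top of the handler (these are standard HTTP headers; whether any intermediary honors them is a separate question):

    <?php
    // Emit explicit no-cache headers before any other output.
    header('Cache-Control: no-store, no-cache, must-revalidate, max-age=0');
    header('Pragma: no-cache');                       // for HTTP/1.0 caches
    header('Expires: Thu, 01 Jan 1970 00:00:00 GMT'); // a date in the past

    // ... rest of the IPN handler ...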
I would triple- and quadruple-check the URL that PayPal calls, to see whether there is a second version of the script hanging around that doesn't get updated - maybe a case of Index.php vs. index.php or something similar? That is often the reason.
The only caching culprit I can think of is a reverse proxy on your web server's end. But you're not mentioning having one, so I'm assuming there is none.
I have a website that is consuming its own API. This means sending a request to itself.
I feel like there might be a way to send a request to a local page that is quicker than just requesting the full API URL.
Right now I'm doing: file_get_contents("http://domain.com/api/recent")
These didn't work when I tried them:
file_get_contents("http://localhost/api/recent")
file_get_contents("http://127.0.0.1/api/recent")
Sorry, I can't comment since I don't have enough reputation - my 5 cents:
You could use the local API by including/requiring the API PHP file, after setting up all the POST/GET variables prior to the inclusion. If the user sent you important data, you could cache it and then restore it from the cache afterwards (see the sketch below).
When you call the API over HTTP, it is inherently slower, because the request goes through the web server instead of staying inside the PHP engine. Cheers.
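A rough sketch of that idea, assuming the endpoint lives in a file such as api/recent.php (a hypothetical path) and reads its input from the superglobals:

    <?php
    // Back up the current request variables so we can restore them later.
    $backupGet  = $_GET;
    $backupPost = $_POST;

    // Set up the variables the API file expects (illustrative names).
    $_GET = array('limit' => 10);

    // Capture whatever the API file echoes instead of sending it to the client.
    ob_start();
    require '/path/to/api/recent.php'; // hypothetical path
    $response = ob_get_clean();

    // Restore the original request variables.
    $_GET  = $backupGet;
    $_POST = $backupPost;

    // $response now holds what file_get_contents() would have returned,
    // without a round trip through the web server.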
I'm making a simple web-based control panel, and I figured the easiest way to accomplish it would be to have PHP on both machines (one being the web-facing machine, the other being behind a VPN). Basically, when I press a button on the site on the externally facing IP of machine 1, it should send a request to the internally facing IP (e.g. 192.168.100.1) of machine 2 and run the PHP file (test.php plus some $_GET data) without actually redirecting the end user to 192.168.100.1, because obviously that would time out, as the user has no access to it.
If all you want is to make certain internal PHP pages accessible on the external server, you should consider setting up a reverse proxy instead of manually proxying requests with PHP.
See the Apache documentation for an example: http://httpd.apache.org/docs/2.2/mod/mod_proxy.html
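A minimal sketch of what that could look like in the external server's Apache configuration (the internal IP is taken from the question; the /panel/ path is illustrative, and mod_proxy plus mod_proxy_http must be enabled):

    # Forward /panel/ on the external server to machine 2,
    # without redirecting the browser there.
    ProxyPass        /panel/ http://192.168.100.1/
    ProxyPassReverse /panel/ http://192.168.100.1/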
Of course this won't work if you do your authentication on the external server and/or need to execute additional PHP code on the external server before/after the internal PHP code. In that case refer to Mihai's or Louis's answer.
You can use cURL to send or forward HTTP requests from machine 1 to machine 2, receive the responses machine 2 gives you, and (if needed) process those responses before showing them to the user.
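A minimal sketch of that, assuming machine 2 exposes test.php as described in the question (the query string is illustrative, and error handling is omitted):

    <?php
    // Forward a GET request from machine 1 to machine 2 and
    // capture machine 2's response body.
    $ch = curl_init('http://192.168.100.1/test.php?action=restart');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // don't let the panel hang forever
    $response = curl_exec($ch);
    curl_close($ch);

    echo $response; // show machine 2's output to the user, or process it first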
You could also use (XML-/JSON-)RPC or SOAP, which would be a bit more elegant, extensible and commonplace than raw cURL, but comes with a steeper learning curve and more setup work.
You should also be able to use file_get_contents (which normally supports the http:// wrapper) or http_get, a function designed for simple HTTP GET requests.
It might not be the most ideal way, but it should be fairly easy to do.
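For completeness, the file_get_contents variant is a one-liner (allow_url_fopen must be enabled; the query string is again illustrative):

    <?php
    // Simple HTTP GET via PHP's http stream wrapper.
    $response = file_get_contents('http://192.168.100.1/test.php?action=restart');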
Disclaimer: this may be an insane question, but I have suffered a lot, so I came here.
I am working on a legacy application which uses JS + PHP + web services (written in Spring).
Flow of the application:
Whenever any web service is called from JS, the call is redirected to one PHP file. That PHP file authenticates the user (using one web service) and then forwards the request to the actual web service.
How can I debug this application? I have debugged JS using Firebug and server-side code using Eclipse, but I have never debugged such an application.
~Ajinkya.
I think there are a variety of things that need to be done, and I must say this question is general enough not to have a single straight answer, so I will do my best. As xdazz mentioned, var_dump (and die) are essential from the PHP standpoint.
Whenever anything is returned to JS, console.log it. In addition, ensure XHR logging is turned on in Firebug, or alternatively view the output of each request in the Chrome Network tab.
With a combination of console.log, var_dump, and die, you can trace the non-functioning parts of the application step by step until you come across the bug.
Alternatively, and in the long run you ought to be doing this anyway, build error-handling code into all the PHP code that is only activated when a debug flag is set to true. This way you get detailed error messages during development, and when you deploy you can turn them off to avoid compromising security.
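A minimal sketch of such a debug flag (the constant name is arbitrary):

    <?php
    // Toggle this to false before deploying.
    define('APP_DEBUG', true);

    if (APP_DEBUG) {
        // Show everything while developing.
        error_reporting(E_ALL);
        ini_set('display_errors', '1');
    } else {
        // Hide details from users; keep them in the server log.
        ini_set('display_errors', '0');
        ini_set('log_errors', '1');
    }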
If you need to inspect the entire lifecycle of a web service request in your scenario, you will have to combine several techniques. Since the scope of your scenario spans from client to server, you will need to decide how to persist the information you want to inspect.
Personally, I would choose the path of least resistance, which in my case would probably be cookies. With that approach you should be able to chronologically log the necessary information via JavaScript and PHP before, during and after the request and even the redirect have occurred.
This strategy then allows the information logged in cookies to be dumped or analyzed via JavaScript, the WebKit inspector or Firebug. Again, this is probably how I would handle such a scenario. Lastly, you can apply different storage strategies to this technique, such as using a session or a database for persistence.
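A rough sketch of the PHP side of that cookie idea (the cookie name is illustrative; note that a cookie is limited to roughly 4 KB and must be set before any output):

    <?php
    // Append a timestamped entry to a debug cookie so browser tools can
    // show what happened on the server side of the auth/forward step.
    function debug_log_to_cookie($message)
    {
        static $trace = null;
        if ($trace === null) {
            $trace = isset($_COOKIE['debug_trace']) ? $_COOKIE['debug_trace'] : '';
        }
        $trace .= date('H:i:s') . ' ' . $message . '|';
        setcookie('debug_trace', $trace, 0, '/');
    }

    debug_log_to_cookie('auth ok, forwarding to the real web service');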
Note: You can use something like WebKit Inspector, and possibly Firebug, to analyze data transmitted and received for GET, POST and even WebSocket requests.
I have a hobby website written in PHP, and I like to know if there is a problem with it (database errors, an update broke something, etc.). I have a simple notification system which sends me an email if there is a problem, and that would be enough for me. Unfortunately, the mail-sending feature of the hosting provider is not very reliable. Usually it works, but there are periods when it simply swallows the mails and doesn't send anything.
Is there some other reliable method for notifying the maintainer in case of an error? It's a hobby site, so I'm looking for something simple: not an industrial-strength solution, but something more reliable than email. How do you monitor your hobby sites?
I tagged the question with PHP, because the site is written in it, but I'm also interested in generic suggestions, not just in concrete PHP solutions.
EDIT: the question is about the mechanism of active notification. I want to be notified when something happens. If PHP email is not reliable, then what are the other possibilities for notification?
EDIT2: two examples to illustrate what kind of solutions I'm thinking of:
Store the errors and provide a (maybe password-protected) page listing the latest errors, which would be polled from my computer, and a window could pop up if there is an error. It can work, but only while I'm at my home computer.
Use the Google Calendar API to insert an event when an error occurs; Google Calendar will then send me an email reliably. It may work, though it's cumbersome.
Some other idea?
Are you looking only for email-based alerting systems? If not, you should try Notifo. You can use their API to push notifications, and they'll be sent instantly to your phone.
PHP has an error_log function for reporting errors in various ways: via email to an admin, to the server's log file, or to an external file. I assume that you could simply substitute this functionality for your mailto when you find an error:
http://php.net/manual/en/function.error-log.php
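For example, message type 1 sends the message by email, while type 3 appends it to a file of your choice (the address and path below are illustrative):

    <?php
    // Type 1: send the message to an email address.
    error_log('Database connection failed', 1, 'admin@example.com');

    // Type 3: append the message to a custom log file
    // (no newline is added automatically, hence PHP_EOL).
    error_log('Database connection failed' . PHP_EOL, 3, '/var/log/mysite/errors.log');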
I've run into the issues you've mentioned with my hobby project as well. When I started, I was using GoDaddy, whose mail relay was pretty unreliable for delivering mail in a timely fashion.
Two things I'd suggest:
For sending email messages with higher reliability, check out Postmark. It's a paid solution, but the rates are pretty reasonable, and it comes with PHP classes you can hook your code up to fairly easily.
For custom error handling, check out PHP's set_error_handler(). It's a good way to have custom code execute on error conditions on your site. From the documentation:
set_error_handler — Sets a user-defined error handler function.
This function can be used for defining your own way of handling errors during runtime, for example in applications in which you need to do cleanup of data/files when a critical error happens, or when you need to trigger an error under certain conditions (using trigger_error()).
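A minimal sketch of a handler that records errors somewhere more reliable than mail (the log path is illustrative):

    <?php
    // Custom handler: write every PHP error to a file we control.
    function my_error_handler($errno, $errstr, $errfile, $errline)
    {
        $line = date('c') . " [$errno] $errstr in $errfile:$errline" . PHP_EOL;
        file_put_contents('/var/log/mysite/errors.log', $line, FILE_APPEND);
        return true; // skip PHP's internal error handler
    }

    set_error_handler('my_error_handler');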
Maybe give Airbrake (formerly Hoptoad) a try. This is a commercial service, but they have a basic free plan (a tiny little link at the bottom of the pricing page), and the tool looks pretty cool. It's focused on Ruby on Rails, but according to their site it has plugins for various other frameworks and languages, including PHP.
http://airbrakeapp.com/pages/home
We have a system set up that polls specific pages on our important websites every now and then and checks for certain strings. Would something like that be viable for you?
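A sketch of that kind of check as a cron-driven PHP script (the URL and marker string are illustrative):

    <?php
    // Cron job: fetch the page and raise an alert if the expected marker is missing.
    $html = @file_get_contents('http://example.com/');

    if ($html === false || strpos($html, 'expected-footer-text') === false) {
        // Replace this with whatever notification channel you trust.
        error_log('Site check failed at ' . date('c') . PHP_EOL, 3, '/var/log/sitecheck.log');
    }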
My repository provider (Beanstalk) enables me, in the deployment process, to trigger a "web hook" after each deployment. That is, after each deployment it sends a POST request to a URL provided by me, with JSON-encoded information about the deployment.
If I wasn't clear enough, this http://help.beanstalkapp.com/kb/deployments/post-release-web-hooks-for-deployments is a short and clear explanation provided by my repository provider.
I'm trying to write the script that will process this request. Actually, I'm not interested in the provided information about the deployment. All I need is to know that a deployment was done, and to perform some activity on my server (for example, to update the DB). And that's what I did: I wrote a script that, when triggered, updates the DB and generally does what it needs to do on the server.
Now,
The repository provider refuses to accept the URL of my script as a processor of its hooks, because it says the URL responds with HTTP 400.
Why might this happen? How can I control the returned HTTP status code? I've never had to take care of this in my normal programming.
When I test my hook processor by directing a browser to it, I see that it's successfully triggered and does what it needs to do on the server side.
The whole project, including my hook-processor, is written in PHP + the Yii framework. The server is Apache.
I figure that my (complete?) lack of understanding of the HTTP protocol is probably what creates the problem for me here.
Thank you
Gidi
EDIT:
Adding here the (trivial) code that handles the hook. It has only side effects, no output. (If I do add output, like echo 'done';, nothing changes.)
public function actionAfterDeployment()
{
    // Locate the application root relative to the Yii framework directory.
    $rootPath = Yii::getPathOfAlias('system') . '/../..';
    $console  = $rootPath . '/crm/console.php';

    // Run the console task for this deployment, passing along the host name.
    exec("php $console onEachDeployment " . $_SERVER['SERVER_NAME']);
}
As per gidireich's comment: my application was throwing an HTTP exception due to a wrong CSRF token - no token was supplied, and it's a POST request, which in our system requires such a token. Thanks to everybody who participated.
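For anyone hitting the same issue: in Yii 1.x, CSRF validation is controlled by the request component's enableCsrfValidation property, and it runs before the controller action is reached, so one workaround is to disable it conditionally in the application config. A hedged sketch, assuming the webhook route contains "afterDeployment" (the route check is illustrative):

    // protected/config/main.php
    'components' => array(
        'request' => array(
            // Skip CSRF validation only for the webhook URL.
            'enableCsrfValidation' => strpos(
                isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : '',
                'afterDeployment'
            ) === false,
        ),
    ),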