I'm trying to understand why this script doesn't work. When I execute the script simultaneously in the same browser or browser tab, the second script does not see the created file /tmp/monkey.tmp (php7.4-fpm + nginx, default config, opcache enabled).
As soon as I use two different browsers, it works as expected. It also works as expected if I execute the same script/URL simultaneously and one request carries random data, for example URL?_=monkey. The problem only occurs with the same URL in the same browser, and I don't understand why.
<?php
$tmpfile = '/tmp/monkey.tmp';
clearstatcache();
if (file_exists($tmpfile)) {
    die('file exists');
} else {
    file_put_contents($tmpfile, 'blabla');
}
sleep(20);
exit;
PHP can be tricky to debug with browsers, because they will most likely cache the page, even though that is not the wanted behaviour.
My guess is that the browser or webserver cached the page, and that is why you don't see the change,
while the other browser, which didn't cache it, does.
That also explains why you see the change in the same browser when changing some part of the URL: the browser treats it as another page, cannot use its cache, and goes to the server.
To debug, you can try the following:
Run the script in one browser and check by hand whether the file got created.
If that's the case and after refreshing you still don't get "file exists" -> browser caching.
Try to install a plugin to clear the cache.
To stop this, you can try disabling caching for this site in your browser's devtools.
Or set the cache metadata in the HTML to never cache the page.
Or run the request via JS and print the result to the page.
If after the cache clear you still don't see "file exists",
it may be the webserver caching the response.
Though that's unlikely...
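One quick way to rule caching in or out, in the spirit of the URL?_=monkey trick from the question, is to append a unique query parameter to every request. A minimal sketch in JavaScript (the URL is an example):

```javascript
// Append a unique query parameter so neither the browser nor an intermediate
// cache can reuse a previous response (the same trick jQuery's cache:false uses).
function cacheBust(url) {
  var sep = url.indexOf('?') === -1 ? '?' : '&';
  return url + sep + '_=' + Date.now();
}

// Every call produces a distinct URL, forcing a fresh request:
console.log(cacheBust('/monkey.php'));     // e.g. /monkey.php?_=1700000000000
console.log(cacheBust('/monkey.php?a=1')); // e.g. /monkey.php?a=1&_=1700000000001
```

If the behaviour changes with cache-busted URLs, caching was the culprit; if it doesn't, the cause lies elsewhere (for example request serialization on the server or in the browser).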
First of all, I want to take a few words to explain that I am fully aware of what the cross-domain problem is and how to deal with it (in plain JS with jQuery, but not in Vue).
Here is the case:
I want to get the WFS features (as XML) from some GeoServer instances (other domains) using an HTTP GET request. It is clear that it will be blocked because of the same-origin policy.
I used to use a very simple proxy.php file to get around this, and it works pretty well.
The proxy.php file is like this:
<?php
$nix = "";
$type = $_GET['requrl'];
if ($_GET['requrl'] != $nix) {
    $file = file_get_contents($_GET['requrl']);
} else {
    $file = "false type";
}
echo $file;
?>
So basically I write in JS an Ajax call with jQuery, which looks like this:
jQuery.ajax({
    type: "GET",
    data: {
        requrl: "www.example.com/WFS?service=wfs&request=GetCapabilities"
    },
    url: "proxy.php"
}).done(function (response) {
    // handle the response text/xml
    console.log(response);
});
The old way works well: I send the "true" URL as requrl to the PHP file, the PHP gets what I want for me and returns it as the response, so I can later handle the response in the jQuery Ajax callback.
The real problem:
But now I am moving my whole app to the Vue.js framework, so I am using vue-resource instead of the jQuery Ajax.
vue-resource is not hard to understand, so I make the HTTP GET request as below:
this.$http.get('/static/proxy.php', {params: {requrl:"www.google.de"}}).then(response => {
// success
console.log("oh! success!");
}, response => {
// error
console.log("oh! error!");
});
I placed the proxy.php file in the public/static folder, and vue-resource keeps returning the content of proxy.php to me instead of executing it and returning the response.
I have checked the HTTP GET request with the Firefox dev tools, and it shows that the GET request is 200 OK, but the response is always the source of proxy.php. It seems the PHP file is not doing the work I expected it to do.
And here is the response I got from the vue-resource GET request:
<?php $nix=""; $type=$_GET['requrl']; if ($_GET['requrl'] != $nix) { $file = file_get_contents($_GET['requrl']); } else { $file = "false type"; } echo $file; ?>
I suspect that the dev server could be the reason, because in the old days I had PHP installed in my local Apache server, and now with Vue.js I just type npm run serve each time I want to start a dev server. I don't know what kind of dev server I just started and whether it works with PHP.
So I would like to ask how you deal with vue-resource and PHP, or whether there is a better way in Vue.js to get around the cross-domain problem?
Here is the dev environment I am using now:
The project is created with Vue.js and Vue CLI 3 (which bundles webpack etc.)
The plug-ins I use are vuetify and vue-resource
For those who may search for the same question in the future, I have solved it this way:
Set up an Apache server with PHP installed. (I got the content of my proxy.php because I didn't have PHP installed in the dev server started by the command npm run serve, and that was why it didn't work!)
Make sure that you enable CORS on your Apache server! This Apache server will run on a different port (for example 8888), while the dev server for your Vue app runs, for example, on 8080 by default, and different ports are also considered cross-domain. So make sure to enable CORS on your Apache server!
Point your HTTP GET request at the proxy.php file on your Apache server with vue-resource. Here is an example from my Vue app (my Apache server is running on port 8888, the proxy.php file is shown above in this question; here I get my own IP with a GET request to http://httpbin.org/ip):
this.$http.get('http://localhost:8888/proxy.php', {params: {requrl: "http://httpbin.org/ip"}}).then(response => {
// success
console.log("oh! success!");
console.log("success response: ", response);
}, response => {
// error
console.log("oh! error!")
console.log("error response: ", response);
});
Now you have successfully worked around the cross-domain problem!
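For reference, enabling CORS for the two-port dev setup described above can be as small as one mod_headers directive in the Apache config. This is a sketch, not a definitive config; the directory path is an example and mod_headers must be enabled (a2enmod headers on Debian/Ubuntu):

```apache
# Allow the Vue dev server on port 8080 to call proxy.php served on port 8888.
# "/var/www/html" is an example DocumentRoot; adjust to where proxy.php lives.
<Directory "/var/www/html">
    Header set Access-Control-Allow-Origin "http://localhost:8080"
</Directory>
```

Using the exact dev-server origin instead of "*" keeps the rule as narrow as possible.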
I placed the proxy.php file in the public/static folder, and vue-resource keeps returning the content of proxy.php to me instead of executing it and returning the response.
In order to run your PHP files, you will need to configure a local server that serves and executes PHP, because your npm run serve command serves static files only (JavaScript, HTML, CSS, etc.).
I suggest you install XAMPP to easily configure a PHP development environment.
You will then have one local server for your PHP environment and another for your Vue app, running on different ports.
I have an issue with the Magento custom cache.
I have an observer method which is launched by cron, where I write a value to the cache:
Mage::app()->saveCache($visitorsCount, 'cached_google_analytics_visitors_count', [], $twoDaysInSeconds);
The value is successfully saved and I'm able to extract it from the cache here. And the files
mage---4ae_CACHED_GOOGLE_ANALYTICS_VISITORS_COUNT
and
mage---internal-metadatas---4ae_CACHED_GOOGLE_ANALYTICS_VISITORS_COUNT
are there too.
Now it's time to extract the value from the cache in my block, so I do it this way:
$visitorsCount = Mage::app()->loadCache('cached_google_analytics_visitors_count');
But it returns false. I've investigated that the reason is that there is no CACHED_GOOGLE_ANALYTICS_VISITORS_COUNT in the metadatasArray in the Zend_Cache_Backend_File class, although the metadata file exists.
More than that, metadatasArray does contain this value while I'm writing the value to the cache.
Hoping for your help.
Regards, Nikolay
I've found the reason for the error:
cron was running as a different user than the web server, so the PHP process didn't have permission to read the metadata file. I've launched cron as the www-data user and it works correctly now.
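A sketch of that fix, assuming a Debian-style setup where the web server runs as www-data (the paths and schedule are examples, not Magento's canonical values): register the cron job under the web server's user so both processes agree on file ownership.

```shell
# Edit www-data's crontab instead of root's:
#   sudo crontab -u www-data -e
# and schedule Magento's cron there, e.g.:
*/5 * * * * /usr/bin/php /var/www/magento/cron.php
```

Alternatively, keep the existing crontab but fix the permissions/umask on the cache directory so both users can read the metadata files.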
I'm developing a page that pulls images from Flickr and Panoramio via jQuery's AJAX support.
The Flickr side is working fine, but when I try to $.get(url, callback) from Panoramio, I see an error in Chrome's console:
XMLHttpRequest cannot load http://www.panoramio.com/wapi/data/get_photos?v=1&key=dummykey&tag=test&offset=0&length=20&callback=processImages&minx=-30&miny=0&maxx=0&maxy=150. Origin null is not allowed by Access-Control-Allow-Origin.
If I query that URL from a browser directly it works fine. What is going on, and can I get around this? Am I composing my query incorrectly, or is this something that Panoramio does to hinder what I'm trying to do?
Google didn't turn up any useful matches on the error message.
EDIT
Here's some sample code that shows the problem:
$().ready(function () {
    var url = 'http://www.panoramio.com/wapi/data/get_photos?v=1&key=dummykey&tag=test&offset=0&length=20&callback=processImages&minx=-30&miny=0&maxx=0&maxy=150';
    $.get(url, function (jsonp) {
        var processImages = function (data) {
            alert('ok');
        };
        eval(jsonp);
    });
});
You can run the example online.
EDIT 2
Thanks to Darin for his help with this. THE ABOVE CODE IS WRONG. Use this instead:
$().ready(function () {
    var url = 'http://www.panoramio.com/wapi/data/get_photos?v=1&key=dummykey&tag=test&offset=0&length=20&minx=-30&miny=0&maxx=0&maxy=150&callback=?';
    $.get(url, function (data) {
        // can use 'data' in here...
    });
});
For the record, as far as I can tell, you had two problems:
You weren't passing a "jsonp" type specifier to your $.get, so it was using an ordinary XMLHttpRequest. However, your browser supported CORS (Cross-Origin Resource Sharing) to allow cross-domain XMLHttpRequest if the server OKed it. That's where the Access-Control-Allow-Origin header came in.
I believe you mentioned you were running it from a file:// URL. There are two ways for CORS headers to signal that a cross-domain XHR is OK. One is to send Access-Control-Allow-Origin: * (which, if you were reaching Flickr via $.get, they must have been doing) while the other was to echo back the contents of the Origin header. However, file:// URLs produce a null Origin which can't be authorized via echo-back.
The first was solved in a roundabout way by Darin's suggestion to use $.getJSON. It does a little magic to change the request type from its default of "json" to "jsonp" if it sees the substring callback=? in the URL.
That solved the second by no longer trying to perform a CORS request from a file:// URL.
To clarify for other people, here are the simple troubleshooting instructions:
If you're trying to use JSONP, make sure one of the following is the case:
You're using $.get and set dataType to jsonp.
You're using $.getJSON and included callback=? in the URL.
If you're trying to do a cross-domain XMLHttpRequest via CORS...
Make sure you're testing via http://. Scripts running via file:// have limited support for CORS.
Make sure the browser actually supports CORS. (Opera and Internet Explorer are late to the party)
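For anyone unclear on what JSONP actually does under the hood: the mechanism can be simulated in a few lines of plain JavaScript. No browser or jQuery is involved here; the response string is a stand-in for what a JSONP endpoint would return.

```javascript
// JSONP: the server wraps its JSON in a call to the callback you named in the
// URL (?callback=processImages), and the browser executes the response as a script.
var received = null;
function processImages(data) { received = data; }

// What a JSONP endpoint would answer for a request ending in &callback=processImages:
var serverResponse = 'processImages({"photos":[{"photoUrl":"http://example.com/a.jpg"}]})';

// The browser runs this text as a <script>; we simulate that execution here:
new Function('processImages', serverResponse)(processImages);

console.log(received.photos[0].photoUrl); // http://example.com/a.jpg
```

This is also why JSONP sidesteps the same-origin policy: script tags may load from any origin, while XMLHttpRequest may not.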
You may need to add a header in your called script; here is what I had to do in PHP:
header('Access-Control-Allow-Origin: *');
More details in Cross domain AJAX ou services WEB (in French).
For a simple HTML project:
Python 2
cd project
python -m SimpleHTTPServer 8000
Python 3
cd project
python -m http.server 8000
Then browse to your file.
Works for me on Google Chrome v5.0.375.127 (I get the alert):
$.get('http://www.panoramio.com/wapi/data/get_photos?v=1&key=dummykey&tag=test&offset=0&length=20&callback=?&minx=-30&miny=0&maxx=0&maxy=150',
function(json) {
alert(json.photos[1].photoUrl);
});
Also I would recommend using the $.getJSON() method instead, as the previous doesn't work on IE8 (at least on my machine):
$.getJSON('http://www.panoramio.com/wapi/data/get_photos?v=1&key=dummykey&tag=test&offset=0&length=20&callback=?&minx=-30&miny=0&maxx=0&maxy=150',
function(json) {
alert(json.photos[1].photoUrl);
});
You may try it online from here.
UPDATE:
Now that you have shown your code, I can see the problem with it: you have both an anonymous function and an inline function, and both would be called processImages. That's how jQuery's JSONP support works: notice how I define callback=? so that you can use an anonymous function. You can read more about it in the documentation.
Another remark: you shouldn't call eval. The parameter passed to your anonymous function will already have been parsed into JSON by jQuery.
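The point about eval is easy to demonstrate: once data arrives as a string, JSON.parse yields the same object without executing arbitrary code. A small sketch (the JSON payload is an example):

```javascript
// jQuery hands your JSONP callback an already-parsed object, so eval is unnecessary.
// If you ever do hold raw JSON text yourself, parse it instead of eval-ing it:
var raw = '{"photos":[{"photoUrl":"http://example.com/a.jpg"}]}';
var data = JSON.parse(raw);           // safe: parses data, runs nothing
// var data = eval('(' + raw + ')');  // works, but executes whatever the string contains
console.log(data.photos[0].photoUrl); // http://example.com/a.jpg
```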
As long as the requested server supports the JSON data format, use the JSONP (JSON with Padding) interface. It allows you to make external domain requests without proxy servers or fancy header stuff.
If you are doing local testing or calling the file from something like file://, then you need to disable browser security.
On a Mac:
open -a Google\ Chrome --args --disable-web-security
It's the same origin policy, you have to use a JSON-P interface or a proxy running on the same host.
We managed it via the httpd.conf file (edited it and then restarted the HTTP service):
<Directory "/home/the directory_where_your_serverside_pages_is">
Header set Access-Control-Allow-Origin "*"
AllowOverride all
Order allow,deny
Allow from all
</Directory>
In the Header set Access-Control-Allow-Origin "*" line, you can put a precise URL instead of the wildcard.
In my case, same code worked fine on Firefox, but not on Google Chrome. Google Chrome's JavaScript console said:
XMLHttpRequest cannot load http://www.xyz.com/getZipInfo.php?zip=11234.
Origin http://xyz.com is not allowed by Access-Control-Allow-Origin.
Refused to get unsafe header "X-JSON"
I had to drop the www part of the Ajax URL for it to match correctly with the origin URL and it worked fine then.
As a final note, the Mozilla documentation explicitly says that
The above example would fail if the header was wildcarded as:
Access-Control-Allow-Origin: *. Since the Access-Control-Allow-Origin explicitly mentions http://foo.example,
the credential-cognizant content is returned to the invoking web
content.
As a consequence, it is not simply bad practice to use '*': for credentialed requests, it simply does not work :)
Not all servers support JSONP. It requires the server to wrap its results in the callback function. I use this to get JSON responses from sites that return pure JSON but don't support JSONP:
function AjaxFeed() {
    return $.ajax({
        url: 'http://somesite.com/somejsonfile.php',
        data: {something: true},
        dataType: 'jsonp',
        /* Very important */
        contentType: 'application/json'
    });
}

function GetData() {
    AjaxFeed()
        /* Everything worked okay. Hooray */
        .done(function(data) {
            return data;
        })
        /* Okay jQuery is stupid manually fix things */
        .fail(function(jqXHR) {
            /* Build HTML and update */
            var data = jQuery.parseJSON(jqXHR.responseText);
            return data;
        });
}
I use an Apache server, so I've used the mod_proxy module. Enable the modules:
LoadModule proxy_module modules/mod_proxy.so
LoadModule proxy_http_module modules/mod_proxy_http.so
Then add:
ProxyPass /your-proxy-url/ http://service-url:serviceport/
Finally, pass the proxy URL to your script.
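With that ProxyPass directive in place, the page only ever requests same-origin URLs. A tiny helper shows the mapping; /your-proxy-url/ and service-url are the placeholders from the directive above, not real endpoints:

```javascript
// Rewrites an absolute service URL to the same-origin path that ProxyPass
// forwards to http://service-url:serviceport/.
function toProxied(serviceUrl, proxyPrefix) {
  var u = new URL(serviceUrl);
  return proxyPrefix.replace(/\/$/, '') + u.pathname + u.search;
}

console.log(toProxied('http://service-url:8080/data/get_photos?v=1', '/your-proxy-url/'));
// -> /your-proxy-url/data/get_photos?v=1
```

The browser sees only /your-proxy-url/..., so no CORS negotiation happens; Apache forwards the request server-side.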
For PHP, this worked for me on Chrome, Safari and Firefox, using axios to call live PHP services from a file:// page:
header('Access-Control-Allow-Origin: null');
See https://w3c.github.io/webappsec-cors-for-developers/#avoid-returning-access-control-allow-origin-null for the caveats around this value.
I also got the same error in Chrome (I didn't test other browsers). It was due to the fact that I was navigating on domain.com instead of www.domain.com. A bit strange, but I could solve the problem by adding the following lines to .htaccess, which redirect domain.com to www.domain.com. I am a lazy web visitor, so I almost never type the www, but apparently in some cases it is required.
RewriteEngine on
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]
Make sure you are using the latest version of jQuery. We were facing this error with jQuery 1.10.2, and it was resolved after upgrading to jQuery 1.11.1.
Folks,
I ran into a similar issue, but using Fiddler I was able to get at it. The problem is that the client URL configured in the CORS implementation on the Web API side must not have a trailing forward slash. After submitting the request via Google Chrome and inspecting the TextView tab of the Headers section in Fiddler, the error message states something like this:
"The specified policy origin 'your_client_url/' is invalid. It cannot end with a forward slash."
This is really quirky, because it worked without any issues on Internet Explorer but gave me a headache when testing with Google Chrome.
I removed the forward slash in the CORS code and recompiled the Web API, and now the API is accessible via Chrome and Internet Explorer without any issues. Please give this a shot.
Thanks,
Andy
There is a small problem with the solution posted by CodeGroover above: if you change a file, you'll have to restart the server to actually serve the updated file (at least, in my case).
So, searching a bit, I found this one. To use:
sudo npm -g install simple-http-server # to install
nserver # to use
It will then serve at http://localhost:8000.
I have no issue debugging PHP files independently, but when I want to see the request that the server side (PHP) gets from the client side locally, I can't. I try to do it by putting a breakpoint inside the PHP file, hoping the debugger will stop on the breakpoint when I debug my project using Chrome.
My PHP looks like this:
<?php
$response = "Super"; // <-- this line has a breakpoint
echo $response;
The client side sending the request to the server side looks like this:
function ajaxRequest(url, data, requestMethod) {
    $.ajax({
        type: requestMethod,
        url: url,
        data: {
            json: data
        },
        success: responseHandler
    });
}
When I run the project in debug mode, I get a Chrome window with this URL:
http://localhost/Jason/index.html?XDEBUG_SESSION_START=19067
And in my PhpStorm debugger I see "waiting for connection with ide key 19067".
Chrome displays the page as if the request had already been sent and the response received, without stopping at the PHP breakpoint.
After you start PHP debugging, try right-clicking in the browser window and selecting Inspect in PhpStorm. This should also activate the JS debugger in PhpStorm alongside the PHP debugger.
Of course, you need the Chrome extension for PhpStorm installed:
https://chrome.google.com/webstore/detail/jetbrains-ide-support/hmhgeddbohgjknpmjagkdomcpobmllji
Hope this helps.
[Later edit]
Ah, and deactivate any JavaScript minifying you may have!