I'm using jQuery-File-Upload with jQuery-Iframe-Transport to try to get support for older versions of IE.
I've set the forceIframeTransport option to true so that it behaves more or less the same way in all browsers, but I don't seem to get any data back on the server-side regardless of browser when it uses the iframe transport.
I've spat out the request headers server-side and I get back:
array(
Host => "*******"
Connection => "keep-alive"
Content-Length => "0"
Accept => "*/*"
Origin => "**************"
X-Requested-With => "XMLHttpRequest"
User-Agent => "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.17 (KHTML, like Gecko) Chrome/24.0.1312.57 Safari/537.17"
DNT => "1"
Referer => "***********"
Accept-Encoding => "gzip,deflate,sdch"
Accept-Language => "en-GB,en-US;q=0.8,en;q=0.6"
Accept-Charset => "ISO-8859-1,utf-8;q=0.7,*;q=0.3"
Cookie => "*********"
)
[The *****s indicate bleeped-out info; you don't need that ;)]
Those headers look OK, but $_REQUEST is empty (i.e., array()), and the input buffer is empty too:
$handle = fopen('php://input', 'r');
$file_data = '';
while(($buffer = fgets($handle, 4096)) !== false) {
$file_data .= $buffer;
}
fclose($handle); // $file_data = '';
This all worked fine when I wasn't using the iframe transport, but I need IE support... does anyone have experience with transmitting files via iframes who might know why no data is coming through?
When I use jQuery-File-Upload/js/jquery.iframe-transport.js and force iframe transport, it works in Chrome, but the requests don't even make it to the server in IE.
When I use jquery-iframe-transport/jquery.iframe-transport.js and force iframe transport, it breaks in Chrome (but that's fine, because Chrome supports proper XHR file transfers), and the requests at least hit the server in IE, but no data comes through.
I've updated my script to support either transfer method:
if(empty($_FILES)) {
$handle = fopen('php://input', 'r');
$file_data = '';
while(($buffer = fgets($handle, 4096)) !== false) {
$file_data .= $buffer;
}
fclose($handle);
} else {
$file_data = file_get_contents($_FILES['files']['tmp_name'][0]);
}
But again, I still can't seem to get any data in IE regardless of what I do.
When I say "IE", I'm specifically testing in IE 8 right now. I need support back to 7 though. This guy claims support all the way back to IE 6.
After many hours, I've finally tracked down the issue.
First, you need to use the transport plugin that comes bundled with jQuery-file-upload because it was made for it ;) I'm not quite sure why the other one got a step further, but I'll get to that in a minute.
I noticed in IE that I was getting an "access is denied" JavaScript error somewhere in the core jquery library. From what I read online this usually happens when you try to submit to a URL at a different domain, which I wasn't doing, so I dismissed it.
I was comparing what the two transport scripts did differently when I came to a line that said form.submit() in one version and form[0].submit() in the other. So I tried adding the [0], and then noticed the "access is denied" error changed to point to that line. So clearly, it didn't like where I was submitting the files to.
I double checked the form.action and the URL still looked fine. Through some Google-fu I discovered that you can also get this error if the event does not originate from the original/native file input element.
I had replaced the native input with a fancy one and then triggered a "fake" 'click' event on the hidden native input. IE didn't like that.
Took out my fake upload button and plopped the native one (<input type="file"/> fyi) back in, and now everything works like a charm in all browsers. Huzzah!
For what it's worth ...
I was working with jQuery v1.9.1, doing virus scanning on files synchronously before the files were uploaded to the server. If a file had a virus, we WERE returning an HTTP 400, and an HTTP 200 if there was no virus.
The HTTP 400 response caused the IE8 "Access Denied" result.
When I changed the server response from 400 to 401, the UI worked perfectly.
Again, "For What It's Worth."
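In code form, the change described above amounts to something like this sketch (the scan logic itself is assumed; only the status-code mapping is shown):

```php
<?php
// Hedged sketch: map the synchronous virus-scan result to a status code.
// 400 triggered IE8's "Access Denied" under the iframe transport; 401 did not.
function responseCodeForScan(bool $infected): int
{
    return $infected ? 401 : 200;
}

// In the upload endpoint (illustrative):
// http_response_code(responseCodeForScan($scanFoundVirus));
```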
I am generating leads via Facebook Lead Ads. My server accepts the RTU from Facebook and I am able to push the data around to my CRM as required for my needs.
I want to send an event to GA for when the form is filled out on Facebook.
Reading over the Google Measurement Protocol Reference it states:
user_agent_string – Is a formatted user agent string that is used to compute the following dimensions: browser, platform, and mobile capabilities.
If this value is not set, the data above will not be computed.
I believe that because I am trying to send the event via a PHP webhook script where no browser is involved, the request is failing.
Here is the relevant part of the code that I'm running (I changed from POST to GET thinking that might have been the issue, will change this back to POST once it's working):
$eventData = [
    'v' => '1',
    't' => 'event',
    'tid' => 'UA-XXXXXXX-1',
    'cid' => '98a6a970-141c-4a26-b6j2-d42a253de37e',
    'ec' => 'my-category-here',
    'ea' => 'my-action-here',
    'ev' => 'my-value-here'
];
//Base URL for API submission
$googleAnalyticsApiUrl = 'https://www.google-analytics.com/collect?';
//Add vars from $eventData object
foreach ($eventData as $key => $value) {
$googleAnalyticsApiUrl .= "$key=$value&";
}
//Remove trailing ampersand for a clean URL
$googleAnalyticsApiUrl = substr($googleAnalyticsApiUrl, 0, -1);
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $googleAnalyticsApiUrl);
curl_setopt($ch,CURLOPT_USERAGENT, $_SERVER['HTTP_USER_AGENT']);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);
I believe it is the user-agent that is causing the issue, because if I manually put the same URL I'm trying to hit into the browser, the event appears instantly within the Realtime tracking in GA.
An example of said URL is:
https://www.google-analytics.com/collect?v=1&t=event&tid=UA-XXXXX-1&cid=98a6a970-141c-4a26-b6j2-d42a253de37e&ec=my-category-here&ea=my-action-here&el=my-value-here
I have used both the live endpoint and the /debug/ endpoint. My code will not submit without error to either, yet if I visit the relevant URLs via browser, the debug endpoint says all is ok and then on the live endpoint the event reaches GA as expected.
I'm aware that curl_setopt($ch, CURLOPT_USERAGENT, $_SERVER['HTTP_USER_AGENT']); tries to forward the browser's user-agent. I have tried filling this option with values such as
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/78.0.3904.87 Safari/537.36"
but it never gets accepted by the Measurement Protocol.
My Questions
Is it possible for me to send these events to GA without a web browser being used in the process? I used to have Zapier push these events for me, so I assume it is possible.
How do I send a valid user_agent_string via PHP? I have tried spoofing it with 'CURLOPT_USERAGENT', but never manage to get them working.
I had the same problem: fetching the collect URL from my browser worked like a charm (I saw the hit in the Realtime view), but fetching it with curl or wget did not. On the terminal, using httpie also worked.
I sent a user agent header with curl, and that did solve the issue.
So I am a bit puzzled by #daveidivide's last comment and that his initial hypothesis was wrong (I mean, I understand that he might have had two problems, but sending the User-Agent header seems mandatory).
In my experience, Google Analytics simply refrains from tracking requests from cURL or wget (possibly others)... perhaps in an attempt to filter out unwanted noise...? 🤷🏼♂️
Any request with a User-Agent including the string "curl" won't get tracked. Overriding the User-Agent header to pretty much anything else, GA will track it.
If you neglect to override the User-Agent header when using cURL, it'll include a default header identifying itself... and GA will ignore the request.
This is also the case when using a package like Guzzle, which also includes its own default User-Agent string (e.g. "GuzzleHttp/6.5.5 curl/7.65.1 PHP/7.3.9").
As long as you provide your own custom User-Agent header, GA should pick it up.
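To make that concrete, here is a minimal PHP sketch; the property ID and client ID are the placeholders from the question, and the User-Agent string is an arbitrary example (anything not containing "curl"), not a value GA requires:

```php
<?php
// Build a Measurement Protocol hit URL; http_build_query URL-encodes values.
function buildCollectUrl(array $params): string
{
    return 'https://www.google-analytics.com/collect?' . http_build_query($params);
}

// Send the hit with a User-Agent that does not contain "curl".
function sendHit(string $url): ?string
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_USERAGENT, 'MyLeadWebhook/1.0'); // placeholder UA
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $response = curl_exec($ch);
    curl_close($ch);
    return $response === false ? null : $response;
}

$url = buildCollectUrl([
    'v'   => '1',
    't'   => 'event',
    'tid' => 'UA-XXXXXXX-1', // placeholder property ID from the question
    'cid' => '98a6a970-141c-4a26-b6j2-d42a253de37e',
    'ec'  => 'my-category-here',
    'ea'  => 'my-action-here',
]);
// sendHit($url); // uncomment to actually send the event
```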
I am running an IIS 8 / PHP web server and am attempting to write a so-called 'proxy script' as a means of fetching HTTP content and loading it onto an HTTPS page.
Although the script does run successfully (outputting whatever the HTTP page sends) in some cases - for example, Google.com, Amazon.com, etc. - it does not work in fetching my own website and a few others.
Here is the code of proxy.php:
<?php
$url = $_GET['url'];
echo "FETCHING URL<br/>"; // displays this no matter what URL I enter
$ctx_array = array('http' =>
    array(
        'method' => 'GET',
        'timeout' => 10,
    )
);
$ctx = stream_context_create($ctx_array);
$output = file_get_contents($url, false, $ctx); // times out for certain requests
echo $output;
When I set $_GET['url'] to http://www.ucomc.net, the script fails. With most other URLs, it works fine.
I have checked other answers on here and other places but none of them describe my issue, nor do the solutions offered solve it.
I've seen suggestions for similar problems that involve changing the user agent, but when I do this it not only fails to solve the existing problem but prevents other sites from loading as well. I do not want to rely on third-party proxies (I don't trust the free ones, don't want to deal with their query limits, and don't want to pay for the expensive ones).
Turns out that it was just a problem with the firewall. Testing it on a PHP sandbox worked fine, so I just had to modify the outgoing connections settings in the server firewall to allow the request through.
I'm trying to download the contents of a web page using PHP.
When I issue the command:
$f = file_get_contents("http://mobile.mybustracker.co.uk/mobile.php?searchMode=2");
It returns a page that reports that the server is down. Yet when I paste the same URL into my browser I get the expected page.
Does anyone have any idea what's causing this? Does file_get_contents transmit any headers that differentiate it from a browser request?
Yes, there are differences: the browser tends to send plenty of additional HTTP headers, and the ones that are sent by both probably don't have the same values.
Here, after a couple of tests, it seems that passing the HTTP header called Accept is necessary.
This can be done using the third parameter of file_get_contents, to specify additional context information:
$opts = array('http' =>
    array(
        'method' => 'GET',
        //'user_agent' => "Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2) Gecko/20100301 Ubuntu/9.10 (karmic) Firefox/3.6",
        'header' => "Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8",
    )
);
$context = stream_context_create($opts);
$f = file_get_contents("http://mobile.mybustracker.co.uk/mobile.php?searchMode=2", false, $context);
echo $f;
With this, I'm able to get the HTML code of the page.
Notes:
I first tested passing the User-Agent, but it doesn't seem to be necessary, which is why the corresponding line is left as a comment.
The value used for the Accept header is the one Firefox sent when I requested that page with Firefox, before trying with file_get_contents.
Some other values might be OK, but I didn't do any tests to determine which value is required.
For more information, you can take a look at:
file_get_contents
stream_context_create
Context options and parameters
HTTP context options -- that's the interesting page here ;-)
Replace all spaces with %20.
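In PHP you don't need to hand-encode: http_build_query does the escaping. A sketch (the searchText parameter below is illustrative, not part of the original URL):

```php
<?php
$base = 'http://mobile.mybustracker.co.uk/mobile.php';
// PHP_QUERY_RFC3986 encodes spaces as %20 (the default, RFC 1738, uses '+')
$url = $base . '?' . http_build_query(
    ['searchMode' => '2', 'searchText' => 'princes street'],
    '', '&', PHP_QUERY_RFC3986
);
// $url ends with: mobile.php?searchMode=2&searchText=princes%20street
```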
Hello dear Stackoverflowers!
I have a problem with a project regarding client -> server communication. I want to transfer data from a C++ program to a server. For this, I chose HTTP as the communication protocol, because it is easy to handle on webservers via PHP scripts or similar. The C++ program sends data (or commands) via HTTP POSTs to a server, and the server generates a plain-text response (MIME type text/plain) via PHP scripts. The generated responses are relatively short: a brief success or failure message, and perhaps a little "payload" (all plain text).
Everything seems to work great on my development machine (local Apache server lampp). However, today I tried moving the server PHP scripts for testing purposes on a live webserver (virtual webserver services running Apache + PHP + MySQL) and, well, something stopped working...
The problem
One server-side PHP script is used to store data from the C++ application in a MySQL database. The data I want to store is a raw JSON string (it is experiment data that is processed later). The JSON string is formed by the C++ application. It is approximately 70 kB (so it is large!) and is sent via a multipart POST request to the webserver. The multipart request is formed via libcurl:
foreach (const HttpKeyValuePair& kv, localServerCommand.httpKeyValuePairs) {
if (curl_formadd(&httpPostFirst, &httpPostLast, CURLFORM_PTRNAME, kv.key.c_str(),
CURLFORM_NAMELENGTH, (long) kv.key.size(),
CURLFORM_PTRCONTENTS, kv.value.c_str(),
CURLFORM_CONTENTSLENGTH, (long) kv.value.size(),
CURLFORM_CONTENTTYPE, "text/plain",
CURLFORM_END) != 0) {
cerr << "Error assembling form data" << endl;
}
}
[...]
CURL* curl = curl_easy_init();
curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, &receiveData);
curl_easy_setopt(curl, CURLOPT_WRITEDATA, &receiveBuffer);
curl_easy_setopt(curl, CURLOPT_URL, url.c_str());
if (useSimplePost) { // Only true if postString.size() < 200 byte
curl_easy_setopt(curl, CURLOPT_POST, 1);
curl_easy_setopt(curl, CURLOPT_POSTFIELDS, postString.c_str());
} else {
// Multipart
curl_easy_setopt(curl, CURLOPT_HTTPPOST, httpPostFirst);
}
curl_easy_setopt(curl, CURLOPT_ERRORBUFFER, curlErrorBuffer);
curl_easy_setopt(curl, CURLOPT_FAILONERROR, 1);
curl_easy_setopt(curl, CURLOPT_TIMEOUT, 30);
CURLcode errorCode = curl_easy_perform(curl);
curl_easy_cleanup(curl);
(Just to resolve any doubts: I checked with Wireshark; it is using the multipart POST. The branch utilizing a simple POST is only there because I thought one could spare the network by leaving out the multipart headers for the more frequent, small requests.)
However, now the interesting part: my server-side script never receives the JSON string. The field that should carry it, called 'data', is not part of the $_POST structure in PHP! To make things weird, all other fields are there anyway. For testing purposes, I dumped the PHP $_GET, $_POST and $_FILES variables on the server into a log file, and they look like this for the request in question:
------
GET
------
array (
)
------
POST
------
array (
'id' => '130',
'nonceid' => '4656',
'authentication' => 'fjOynwtBDE/g/llkQlSgrGUx0ttfJMarExF6E3jg0/QeRgzvp+Chr0XqEIzoK6Rm4/19Q6KIA/Lx32Ti1Y+cQhVdF70AS8GaI2i+0FO3Uj7WfFl4FotUzpbyLpD5/AUe0KOiGA==',
)
------
FILES
------
array (
)
When using my local server, the 'data' field is part of $_POST. The 'data' field is the first field sent to the server: it is written first in the curl_formadd loop above, it is the first field in the TCP stream as checked with Wireshark, and it is also the first field in the $_POST array on my local server.
Server tests
After discovering this issue, I tested the server by uploading a file through WordPress using Firefox, to see if the server just rejects any large $_POST field. However, uploading works (I tested with a large PNG, larger than the JSON data I want to upload).
The next test was to make the 'data' field smaller. I tried uploading ~900 bytes of repetitions of the string
"shorter amount of data with special characters +/=?=?$§+#+\'*>< "
which also worked (using multipart post).
The question
I would like the long 'data' field to be available as part of the $_POST variable, like it is on my development machine. I do not know what is causing the issue. Can it be something with the multipart MIME type I am using ("text/plain")? Are there any configurations limiting POST field sizes in Apache/PHP (I only know about overall POST size limits)?
I suspect this to be an exotic server configuration problem. However, I do not know a lot about the long and (if you do not spend time on it) complicated httpd.conf.
Does anyone know what causes this problem and how to reproduce it on my local server? Or even how to resolve this issue?
Thanks in advance!
Well, I found a work-around, but not the reason for this filtering of large POST fields. I first did some additional testing regarding my missing POST field.
As Robbie suggested, I tried spoofing Firefox headers (Agent = "Mozilla/5.0", ...), but no success.
The next step was to zero in on the "allowed packet size", which seems to be 65536 bytes (the closest interval I tested was 65000-66000 bytes); any field exceeding this limit is dropped.
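To narrow down a cutoff like this, a small diagnostic on the receiving side helps. This sketch simply reports the byte length of every POST field that actually arrived (swallowed fields won't appear at all):

```php
<?php
// Diagnostic sketch: report the byte length of each received POST field.
function fieldLengths(array $post): array
{
    return array_map('strlen', $post);
}

// In the real endpoint (illustrative):
// error_log(var_export(fieldLengths($_POST), true));
```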
After discovering this, I got frustrated, reopened the libcurl API documentation, and changed how data is sent to the server as part of multipart POSTs. Large POST fields are now packed as files rather than as fields, which is the same way browsers do a file upload.
if (kv.value.size() > 2048) {
    valueString = "";
    if (curl_formadd(&httpPostFirst, &httpPostLast, CURLFORM_PTRNAME, kv.key.c_str(),
                     CURLFORM_NAMELENGTH, (long) kv.key.size(),
                     CURLFORM_BUFFERPTR, kv.value.c_str(),
                     CURLFORM_BUFFERLENGTH, (long) kv.value.size(),
                     CURLFORM_FILENAME, kv.key.c_str(),
                     CURLFORM_CONTENTTYPE, "text/plain",
                     CURLFORM_END) != 0) {
        cerr << "Error assembling form data" << endl;
    }
}
On the PHP side, this means that my data no longer enters the script as a $_POST field, but as part of the $_FILES structure. However, the message authentication mechanism I use to only allow uploads by my program requires all data to be in the $_POST structure, and that the sequence of the data fields is not changed. Therefore, in addition to sending my data as a file, I also add an empty field to my POST (via libcurl) with the same name as the file ("data=").
Then, in PHP, I added code to my "configuration script", which is always loaded first, that reassembles the original $_POST message from the sent data files and the received $_POST data:
// Reassemble $_POST structure in case parts of it have been sent as files
foreach ($_POST as $pkey => $pvalue) {
if (strlen($pvalue) == 0) {
// Perhaps it was sent as file?
if (isset($_FILES[$pkey])) {
// Yes, as file! - lets read it in and put it back into $_POST
$fileData = $_FILES[$pkey];
if ($fileData["error"] != 0) {
handleError("File transfer of file '" . $pkey . "' failed");
}
$postDataFileHandle = fopen($fileData["tmp_name"], "rb");
if (!$postDataFileHandle) {
handleError("Cannot read required \$_POST field that was sent as file");
}
$data = fread($postDataFileHandle, $fileData["size"]);
if ($data === FALSE) {
handleError("Error on reading required \$_POST field that was sent as file");
}
fclose($postDataFileHandle);
$_POST[$pkey] = $data;
}
}
}
This works, but I still do not know why or how POST fields are filtered by Apache. Perhaps it is not done by Apache at all; maybe my hosting provider does some sort of deep packet filtering.
Still confused but it works now, thanks to everyone who tried to help!
I'm trying to POST some data (a JSON string) from a PHP script to a Java server (all written by myself) and get the response back.
I tried the following code:
$url="http://localhost:8000/hashmap";
$opts = array('http' => array('method' => 'POST', 'content' => $JSONDATA,'header'=>"Content-Type: application/x-www-form-urlencoded"));
$st = stream_context_create($opts);
echo file_get_contents($url, false,$st);
Now, this code actually works (I get back the right answer as the result), but file_get_contents hangs for 20 seconds every time it executes (I printed the time before and after the call). The operations performed by the server complete quickly, and I'm sure it's not normal to wait this long for the response.
Am I missing something?
Maybe a badly misconfigured server that doesn't send the right content size while using HTTP/1.1.
Either fix the server or request the data as HTTP/1.0.
Try adding Connection: close and Content-Length: strlen($JSONDATA) headers to the $opts.
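Applied to the stream context from the question, that suggestion looks roughly like this sketch (the payload here is a stand-in):

```php
<?php
$JSONDATA = json_encode(['key' => 'value']); // stand-in payload
$opts = ['http' => [
    'method'  => 'POST',
    'content' => $JSONDATA,
    // Connection: close and an explicit Content-Length let the client
    // detect the end of the response without waiting for a timeout.
    'header'  => "Content-Type: application/x-www-form-urlencoded\r\n"
               . "Connection: close\r\n"
               . 'Content-Length: ' . strlen($JSONDATA) . "\r\n",
]];
$st = stream_context_create($opts);
// echo file_get_contents('http://localhost:8000/hashmap', false, $st);
```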
Also, if you want to avoid using extensions, have a look at this class I wrote some time ago to perform HTTP requests using PHP core only. It works on PHP4 (which is why I wrote it) and PHP5, and the only extension it ever requires is OpenSSL, and you only need that if you want to do an HTTPS request. Documented(ish) in comments at the top.
Supports all sorts of stuff - GET, POST, PUT and more, including file uploads, cookies, automatic redirect handling. I have used it quite a lot on a platform I work with regularly that is stuck with PHP/4.3.10 and it works beautifully... Even if I do say so myself...