I have this function in the controller ImportController.php; I am sending a file to import using the Laravel Excel library...
Laravel version: 5.5
Php Version: PHP 7.2.19-0ubuntu0.18.10.1 (cli) (built: Jun 4 2019 14:46:43) ( NTS )
One sheet the client is sending takes a lot of time to process, so I made a progress bar to give the client feedback while data from the sheet is being imported/saved.
This is the JavaScript function; it sends the sheet, the CSRF token, and a flag indicating an AJAX request.
.....
let fd = new FormData();
fd.append(_archiveInputId, archivoSeleccionado.files[0]);
fd.append("_token", window.Laravel.csrfToken);
fd.append("isHttpRequest", true);
let xmlHTTP = new XMLHttpRequest();
xmlHTTP.upload.addEventListener("progress", progressFunction, false);
xmlHTTP.addEventListener("load", transferCompleteFunction, false);
xmlHTTP.addEventListener("error", uploadFailed, false);
xmlHTTP.addEventListener("abort", uploadCanceled, false);
xmlHTTP.responseType = 'json';
xmlHTTP.response = 'json';
xmlHTTP.open("POST", _url, true);
xmlHTTP.send(fd);
.....
/**
* Progress Function
* @param evt
*/
function progressFunction(evt) {
console.log(evt);
}
This is my controller:
public function import(ImportFormRequest $request)
{
$isHttpRequest = $request->has('isHttpRequest'); // has() already returns a boolean
if($isHttpRequest){
echo "creating EventsImport class";
flush();
sleep(1);
}
$import = new EventsImport(EventsImport::TYPE_SIMPLE, $request->file('sheet'));
try{
if($isHttpRequest){
echo "importing data from sheet";
flush();
sleep(1);
}
Excel::import($import, $request->file('sheet'));
}catch (\Exception $e){
return $this->returnJsonResponseHttpRequest("nok","Erro ao realizar importação! Erro: ".$e->getMessage());
}
}
Outputs like echo "importing data from sheet" are just for testing.
I tried with:
ob_flush();
flush();
php.ini -> output_buffering=Off
.htaccess -> mod_env.c -> SetEnv no-gzip 1
But none of these worked inside Laravel. In tests outside Laravel, ob_flush() and flush() work fine.
Does anyone have any idea?
After a long time, a trip through the mountains, and 5 cigarettes:
In the PHP controller function, add:
$response = new StreamedResponse(function () use ($request) {
    for ($i = 0; $i <= 3; $i++) {
        // SSE frames use a lowercase "data:" field name
        echo "data: ".json_encode(['teste' => "teste"])."\n\n";
        flush();
    }
});
$response->headers->set('Content-Type', 'text/event-stream');
$response->headers->set('X-Accel-Buffering', 'no'); // stop nginx from buffering the stream
$response->headers->set('Cache-Control', 'no-cache');
$response->send();
sleep(5);
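A sketch of how the same pattern might wrap the actual import from the question (the status payloads and the final return $response are my own assumptions, not part of the original fix):
$response = new StreamedResponse(function () use ($request) {
    echo "data: ".json_encode(['status' => 'creating EventsImport class'])."\n\n";
    flush();
    $import = new EventsImport(EventsImport::TYPE_SIMPLE, $request->file('sheet'));

    echo "data: ".json_encode(['status' => 'importing data from sheet'])."\n\n";
    flush();
    Excel::import($import, $request->file('sheet'));

    echo "data: ".json_encode(['status' => 'done'])."\n\n";
    flush();
});
$response->headers->set('Content-Type', 'text/event-stream');
$response->headers->set('X-Accel-Buffering', 'no');
$response->headers->set('Cache-Control', 'no-cache');
return $response;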
In the JavaScript function that starts the XMLHttpRequest:
// Remove the responseType and response as json.
let fd = new FormData();
fd.append(_archiveInputId, archivoSeleccionado.files[0]);
fd.append("_token", window.Laravel.csrfToken);
fd.append("isHttpRequest", true);
let xmlHTTP = new XMLHttpRequest();
xmlHTTP.seenBytes = 0; // tracks how much of the streamed response has already been handled
xmlHTTP.onreadystatechange = function(evt) {
    // Fires repeatedly as new chunks of the streamed response arrive
    console.log("%c Content: ", 'color: red;', evt.srcElement.response);
};
xmlHTTP.upload.addEventListener("progress", progressFunction, false);
xmlHTTP.addEventListener("load", transferCompleteFunction, false);
xmlHTTP.addEventListener("error", uploadFailed, false);
xmlHTTP.addEventListener("abort", uploadCanceled, false);
xmlHTTP.open("POST", _url, true);
xmlHTTP.send(fd);
Related
I am posting data to Google Sheets using PHP. Every time I post the data, it gets added to the sheet correctly, but I get the error message:
We're sorry, a server error occurred. Please wait a bit and try again.
My PHP cURL POST code is:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $gsurl);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $data);
$output = curl_exec($ch);
$info = curl_getinfo($ch);
curl_close($ch);
$res = json_decode($output, 1);
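Two things worth checking in the cURL call itself, offered as observations rather than a confirmed diagnosis: without CURLOPT_RETURNTRANSFER, curl_exec() echoes the response and returns true, so json_decode($output, 1) receives a boolean instead of the body; and published Apps Script web-app URLs usually answer with a 302 redirect, which cURL does not follow by default. A sketch of the same POST with both handled:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $gsurl);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $data);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture the body instead of echoing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow the script.google.com redirect
$output = curl_exec($ch);
$info = curl_getinfo($ch);
curl_close($ch);
$res = json_decode($output, 1);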
My GS Apps Script is:
var SCRIPT_PROP = PropertiesService.getScriptProperties(); // new property service
function doGet(e){
return handleResponse(e);
}
function doPost(e){
return handleResponse(e);
}
function handleResponse(e) {
var lock = LockService.getPublicLock();
lock.waitLock(30000);
try {
var doc = SpreadsheetApp.openById(SCRIPT_PROP.getProperty("key"));
var sheet = doc.getSheetByName(SHEET_NAME);
var headRow = e.parameter.header_row || 1;
var headers = sheet.getRange(1, 1, 1, sheet.getLastColumn()).getValues()[0];
var nextRow = sheet.getLastRow()+1;
var row = [];
for (i in headers){
if (headers[i] == "Timestamp"){
row.push(new Date());
} else {
row.push(e.parameter[headers[i]]);
}
}
sheet.getRange(nextRow, 1, 1, row.length).setValues([row]);
// return json success results
return ContentService
.createTextOutput(JSON.stringify({"result":"success", "row": nextRow}))
.setMimeType(ContentService.MimeType.JSON);
} catch(e){
return ContentService
.createTextOutput(JSON.stringify({"result":"error", "error": e}))
.setMimeType(ContentService.MimeType.JSON);
} finally {
lock.releaseLock();
}
}
function setup() {
var doc = SpreadsheetApp.getActiveSpreadsheet();
SCRIPT_PROP.setProperty("key", doc.getId());
}
Please help me.
I have been stuck on this for over a week and I think I am long overdue for asking on here. I am trying to get my users to upload their video files using the jQuery File Upload plugin. We do not want to save the file on our server; the final result is having the file saved in our Backlot using the Ooyala API. I have tried various approaches and I am successful in creating the asset in Backlot and getting my upload URLs, but I do not know how to upload the file chunks to Backlot using those URLs. I have tried FileReader(), FormData(), etc. I am pasting the last code I had that created the asset and gave me the upload URLs, but did not save any chunks into Backlot. I assume I may be getting stuck in one of my AJAX calls, but I am not very sure.
I keep getting:
Uncaught InvalidStateError: An attempt was made to use an object that is not, or is no longer, usable.
Here is my page with the JS for the jQuery File Upload widget by BlueImp:
<html>
<head>
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.9.1/jquery.min.js"></script>
<script type="text/javascript" src="<?php print base_path() . path_to_theme() ?>/res/js/jQuery-File-Upload/js/vendor/jquery.ui.widget.js"></script>
<script type="text/javascript" src="<?php print base_path() . path_to_theme() ?>/res/js/jQuery-File-Upload/js/jquery.iframe-transport.js"></script>
<script type="text/javascript" src="<?php print base_path() . path_to_theme() ?>/res/js/jQuery-File-Upload/js/jquery.fileupload.js"></script>
</head>
<body>
<input id="fileupload" type="file" accept="video/*">
<script>
//var reader = FileReader();
var blob;
$('#fileupload').fileupload({
forceIframeTransport: true,
maxChunkSize: 500000,
type: 'POST',
add: function (e, data) {
var goUpload = true;
var ext = ['avi','flv','mkv','mov','mp4','mpg','ogm','ogv','rm','wma','wmv'];
var uploadFile = data.files[0];
var fileName = uploadFile.name;
var fileExtension = fileName.substring(fileName.lastIndexOf('.') + 1);
if ($.inArray( fileExtension, ext ) == -1) {
alert('You must upload a video file only');
goUpload = false;
}
if (goUpload == true) {
$.post('../sites/all/themes/episcopal/parseUploadJSON.php', 'json=' + JSON.stringify(data.files[0]), function (result) {
var returnJSON = $.parseJSON(result);
data.filechunk = data.files[0].slice(0, 500000);
data.url = returnJSON[0];
//reader.onloadend = function(e) {
//if (e.target.readyState == FileReader.DONE) { // DONE == 2
//data.url = returnJSON[0];
// }
//}
//$.each(returnJSON, function(i, item) {
//data.url = returnJSON[0];
//blob = data.files[0].slice(0, 500000);
//console.log(blob);
//reader.readAsArrayBuffer(blob);
//data.submit();
//});
data.submit();
});
}
},//end add
submit: function (e, data) {
console.log(data); //Seems fine
//console.log($.active);
$.post('../sites/all/themes/episcopal/curlTransfer.php', data, function (result) { //fails
console.log(result);
});
return false;
}
});
</script>
</body></html>
Then there is the parseUploadJSON.php code; please keep in mind that my real code has the right Backlot keys. I am sure of this:
<?php
if(isset($_POST['json'])){
include_once('OoyalaAPI.php');
$OoyalaObj = new OoyalaApi("key", "secret",array("baseUrl"=>"https://api.ooyala.com"));
$expires = time()+15*60; //Adding 15 minutes in seconds to the current time
$file = json_decode($_POST['json']);
$responseBody = array("name" => $file->name,"file_name"=> $file->name,"asset_type" => "video","file_size" => $file->size,"chunk_size" => 500000);
$response = $OoyalaObj->post("/v2/assets",$responseBody);
$upload_urls = $OoyalaObj->get("/v2/assets/".$response->embed_code."/uploading_urls");
$url_json_string = "{";
foreach($upload_urls as $key => $url){
if($key+1 != count($upload_urls)){
$url_json_string .= '"' . $key . '":"' . $url . '",';
}else {
$url_json_string .= '"' . $key . '":"' . $url . '"';
}
}
$url_json_string .= "}";
echo $url_json_string;
}
?>
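As an aside, the hand-built JSON string above could be produced with json_encode(), which also takes care of escaping; a minimal sketch, assuming $upload_urls is a plain array of URL strings:
// Yields the same {"0":"url1","1":"url2",...} shape as the manual concatenation
echo json_encode((object) $upload_urls);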
Then I have the curlTransfer.php:
<?php
echo "starting curl transfer";
echo $_POST['filechunk'] . " is the blob";
if(isset($_FILES['filechunk']) && isset($_POST['url'])){
echo "first test passed";
$url = $_POST['url'];
//print_r(file_get_contents($_FILES['filechunk']));
$content = file_get_contents($_FILES['filechunk']);
print_r($content);
$ch = curl_init($url);
curl_setopt ($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt ($ch, CURLOPT_HTTPHEADER, Array("Content-Type: multipart/mixed"));
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, "PUT");
curl_setopt($ch, CURLOPT_POSTFIELDS, $content);
try {
//echo 'success';
return httpRequest($ch);
}catch (Exception $e){
throw $e;
}
}
/****Code from Ooyala****/
function httpRequest($ch){
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$response = curl_exec($ch);
if(curl_error($ch)){
curl_close($ch);
return curl_error($ch);
}
$head=curl_getinfo($ch);
$content = $head["content_type"];
$code = $head["http_code"];
curl_close($ch);
}
?>
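For what it's worth, two details stand out in curlTransfer.php: $_FILES['filechunk'] is an array, so file_get_contents() needs its ['tmp_name'] path, and httpRequest() reads the response but never returns it. A sketch of both fixes, keeping the names from the original:
// Read the uploaded chunk from its temporary path, not from the $_FILES array itself
$content = file_get_contents($_FILES['filechunk']['tmp_name']);

function httpRequest($ch){
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    $response = curl_exec($ch);
    if(curl_error($ch)){
        $error = curl_error($ch); // grab the error before closing the handle
        curl_close($ch);
        return $error;
    }
    curl_close($ch);
    return $response; // hand the body back to the caller
}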
And the OoyalaApi.php is here (I saved a copy on my server):
https://github.com/ooyala/php-v2-sdk/blob/master/OoyalaApi.php
I apologize in advance that the code is messy and there are a lot of parts commented out. I have changed this code so much and I still cannot get it working. I appreciate all of your time and effort.
EDIT
I went back to trying FileReader out, as the post "Send ArrayBuffer with other string in one Ajax call through jQuery" kind of worked for me, but I think it would be safer to read the file using readAsArrayBuffer, and now I am having trouble saving the array buffer chunks in some sort of array...
We have implemented Ooyala file chunk upload in Ruby on Rails by referring to this.
We used the entire JS file as-is from this link:
https://github.com/ooyala/backlot-ingestion-library
I have a listener in an Android application to detect messages sent by the PHP server:
public class Threa implements Runnable
{
public static final String SERVERIP = "192.168.1.4";
public static final int SERVERPORT =6060 ; //4444
public BufferedReader in;
public int x=0;
public void run() {
try {
ServerSocket serverSocket = new ServerSocket(SERVERPORT);
while (true)
{
x++;
Socket client = serverSocket.accept();
try {
in = new BufferedReader(new InputStreamReader(client.getInputStream()));
String str = in.readLine();
} catch(Exception e) {
} finally {
client.close();
}
}
} catch (Exception e) {
}
}
}
I found this code on the Internet:
function get_url($url)
{
$ch = curl_init();
if($ch === false)
{
die('Failed to create curl object');
}
$timeout = 5;
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
$data = curl_exec($ch);
curl_close($ch);
return $data;
}
echo get_url('http://www.apple.com/');
Is using cURL in PHP to send the message correct? If not, what is the solution to this problem? Please give me an idea.
cURL is good for communicating over protocols such as HTTP, FTP, etc.
If you want to send raw data (i.e., using your very own protocol), then you should use PHP sockets; they are made for exactly that (see the sketch after the link):
http://php.net/manual/fr/book.sockets.php
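For instance, here is a minimal sketch of a PHP client that pushes one line to the Android listener from the question, assuming the device is reachable at 192.168.1.4:6060 as in the Java code:
<?php
// Connect to the Android ServerSocket and send one newline-terminated message,
// matching the BufferedReader.readLine() call on the Java side.
$socket = socket_create(AF_INET, SOCK_STREAM, SOL_TCP);
if ($socket === false) {
    die('socket_create failed: ' . socket_strerror(socket_last_error()));
}
if (socket_connect($socket, '192.168.1.4', 6060) === false) {
    die('socket_connect failed: ' . socket_strerror(socket_last_error($socket)));
}
$message = "hello from php\n"; // readLine() on the Java side needs the trailing newline
socket_write($socket, $message, strlen($message));
socket_close($socket);
?>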
One website I have was originally done in PHP. It does a web POST request to another website every time a user runs a particular query on the site.
function post_request($url, $data, $referer='') {
$data = http_build_query($data);
$url = parse_url($url);
if ($url['scheme'] != 'http') {
die('Error: Only HTTP request are supported !');
}
// extract host and path:
$host = $url['host'];
$path = $url['path'];
// open a socket connection on port 80 - timeout: 7 sec
$fp = fsockopen($host, 80, $errno, $errstr, 7);
if ($fp){
// Set non-blocking mode
stream_set_blocking($fp, 0);
// send the request headers:
fputs($fp, "POST $path HTTP/1.1\r\n");
fputs($fp, "Host: $host\r\n");
if ($referer != '')
fputs($fp, "Referer: $referer\r\n");
fputs($fp, "User-Agent: Mozilla/5.0 Firefox/3.6.12\r\n");
fputs($fp, "Content-type: application/x-www-form-urlencoded\r\n");
fputs($fp, "Content-length: ". strlen($data) ."\r\n");
fputs($fp, "Connection: close\r\n\r\n");
fputs($fp, $data);
$result = '';
while(!feof($fp)) {
// receive the results of the request
$result .= fgets($fp, 128);
}
// close the socket connection:
fclose($fp);
}
else {
return array(
'status' => 'err',
'error' => "$errstr ($errno)"
);
}
// split the result header from the content
$result = explode("\r\n\r\n", $result, 2);
$header = isset($result[0]) ? $result[0] : '';
$content = isset($result[1]) ? $result[1] : '';
// return as structured array:
return array(
'status' => 'ok',
'header' => $header,
'content' => $content
);
}
This approach works trouble-free; the only problem is that it takes nearly 3 CPUs to support 100 concurrent users with the above code (a possible reason is sketched below).
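A possible reason, stated as an assumption rather than a measured fact: stream_set_blocking($fp, 0) puts the socket in non-blocking mode, so the feof()/fgets() loop spins at full speed while waiting for the response instead of sleeping. A sketch of the read loop with the socket left in blocking mode:
// Without the stream_set_blocking($fp, 0) call, fgets() waits for data
// instead of returning immediately, so this loop no longer busy-waits.
$result = '';
while (!feof($fp)) {
    $result .= fgets($fp, 128); // blocks until data arrives or the peer closes
}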
Thinking Node.js would be a good way to do this (the web request would be async), I did the following. In terms of CPU requirements there was a definite improvement (it mostly works with a single CPU, at most 2).
function postPage(postPath, postData, postReferal, onReply, out) {
var post_options = {
host: 'www.somehost.com',
port: '80',
path: postPath,
method: 'POST',
headers: {
'Referer': postReferal,
'Content-Type': 'application/x-www-form-urlencoded',
'Content-Length': postData.length,
'User-Agent': 'Mozilla/5.0 Firefox/3.6.12',
'Connection': 'close'
}
};
// create request
var post_req = http.request(post_options, function (res) {
var reply = '';
res.setEncoding('utf8');
res.on('data', function (chunk) {
reply += chunk;
});
res.on('end', function () {
onReply(reply, out);
});
res.on('error', function (err) {
out.writeHead(500, { 'Content-Type': 'text/html' });
out.end('Error');
});
});
// post the data
post_req.write(postData);
post_req.end();
}
The problem in this case is that it is very fragile, and around 20% of the web requests fail. If the user tries the query again it works, but that is not a good experience.
I am using Windows Azure Websites to host both of the above solutions.
Now, the questions:
Is using PHP for this expected to be taking that much resources, or is it because my code isn't optimal?
What is wrong with my Node code (or Azure), that so many requests fail?
Use the request library
Buffering the entire response
The most basic way is to make a request, buffer the entire response from the remote service (indianrail.gov.in) into memory, and then send that back to the client. However, it is worth looking at the streaming example below.
Install the needed dependencies:
npm install request eyespect
var request = require('request');
var inspect = require('eyespect').inspector({maxLength: 99999999}); // nicer console logging
var url = 'http://www.indianrail.gov.in';
var postData = {
fooKey: 'foo value'
};
var postDataString = JSON.stringify(postData);
var opts = {
method: 'post',
body: postDataString // postData must be a string here..request can handle encoding key-value pairs, see documentation for details
};
inspect(postDataString, 'post data body as a string');
inspect(url, 'posting to url');
request(url, opts, function (err, res, body) { // pass opts so the POST body is actually sent
if (err) {
inspect('error posting request');
console.log(err);
return;
}
var statusCode = res.statusCode;
inspect(statusCode, 'statusCode from remote service');
inspect(body,'body from remote service');
});
Streaming
If you have a response stream to work with, you can stream the POST data without having to buffer everything into memory first. I am guessing that in your example this is the out parameter.
To add some error handling, you can use the async module and repeatedly try the POST request until it either completes successfully or the maximum number of attempts is reached:
npm install request filed temp eyespect async required-keys
var request = require('request');
var inspect = require('eyespect').inspector({maxLength: 99999999}); // nicer console logging
var filed = require('filed');
var temp = require('temp');
var rk = require('required-keys');
var async = require('async');
function postToService(data, cb) {
// make sure the required key-value pairs were passed in the data parameter
var keys = ['url', 'postData'];
var err = rk.truthySync(data, keys);
if (err) { return cb(err); }
var url = data.url;
var postData = data.postData;
var postDataString = JSON.stringify(postData);
var opts = {
method: 'post',
body: postDataString // postData must be a string here..request can handle encoding key-value pairs, see documentation for details
};
var filePath = temp.path({suffix: '.html'});
// open a writable stream to a file on disk. You could however replace this with any writeable stream such as "out" in your example
var file = filed(filePath);
// stream the response to disk just as an example
var r = request(url, opts).pipe(file); // pass opts so this is a POST with the body, not a bare GET
r.on('error', function (err) {
inspect(err, 'error streaming response to file on disk');
cb(err);
});
r.on('end', function (err) {
cb();
});
}
function keepPostingUntilSuccess(callback) {
var url = 'http://www.google.com';
var postData = {
fooKey: 'foo value'
};
var data = {
url: url,
postData: postData
};
var complete = false;
var maxAttemps = 50;
var attempt = 0;
async.until(
function () {
if (complete) {
return true;
}
if (attempt >= maxAttemps) {
return true;
}
return false;
},
function (cb) {
attempt++;
inspect(attempt, 'posting to remote service, attempt number');
postToService(data, function (err) {
// simulate the request failing 3 times, then completing correctly
if (attempt < 3) {
err = 'desired number of attempts not yet reached';
}
if (!err) {
complete = true;
}
cb();
});
},
function (err) {
inspect(complete, 'done with posting, did we complete successfully?');
if (complete) {
return callback();
}
callback('failed to post data, maximum number of attempts reached');
});
}
keepPostingUntilSuccess(function (err) {
if (err) {
inspect(err, 'error posting data');
return;
}
inspect('posted data successfully');
});
I want to read the content of a URL in JavaScript. The URL is not on my domain, so I need a middle layer that can access it cross-domain.
I tried to use a PHP function to read the URL and return the result to JavaScript using jQuery, but it didn't work.
Here is my trial:
I created a PHP file named "phpjs_test.php":
<?php
function get_data(){
$url='http://asmary.dreameg.com/texttable.txt';
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_BINARYTRANSFER, true);
$content = curl_exec($ch);
$content = htmlspecialchars($content);
curl_close($ch);
$content = nl2br($content);
return $content;
}
?>
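One immediate observation on the file above: get_data() is defined but never called, so the script sends back an empty body no matter what the function would return. A minimal sketch of the missing call:
<?php
// ... get_data() defined as above ...
echo get_data(); // without echoing, the AJAX caller receives an empty response
?>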
And this is the JavaScript code:
<script>
$(document).ready(function () {
//httpQuery("http://asmary.dreameg.com/texttable.txt");
getOutput();
});
function getRequest() {
var req = false;
try {
// most browsers
req = new XMLHttpRequest();
} catch (e) {
// IE
try {
req = new ActiveXObject("Msxml2.XMLHTTP");
} catch (e) {
// try an older version
try {
req = new ActiveXObject("Microsoft.XMLHTTP");
} catch (e) {
return false;
}
}
}
return req;
}
function getOutput() {
var ajax = getRequest();
ajax.onreadystatechange = function() {
if (ajax.readyState == 4) {
document.getElementById('output').innerHTML = ajax.responseText;
}
};
ajax.open("GET", "phpjs_test.php", true);
ajax.send(null);
}
I'm completely new to PHP, so I don't even know whether the PHP function is correct or not.
You should just use jQuery's AJAX methods instead of creating the XMLHttpRequest yourself; you don't have to bother with extra code for IE, and you're already loading the jQuery library anyway. Also, if you set the Access-Control-Allow-Origin header in your PHP file and specify the other domain you're requesting from, then you can make an AJAX request and get the response; otherwise it will return nothing, or the network tab in your dev tools will show a 403 Forbidden.
Access-Control-Allow-Origin syntax
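For reference, emitting that header from the PHP side is one line; a minimal sketch (the origin value is a placeholder, and it must be sent before any output):
<?php
// Allow the named origin to read this response from the browser
header('Access-Control-Allow-Origin: http://example.com');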
Change the PHP file to:
<?php
$url='http://asmary.dreameg.com/texttable.txt';
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$content = curl_exec($ch);
curl_close($ch);
$content = explode('~', $content);
$records = array(); // initialize before appending in the loop
foreach ($content as $c)
{
    $records[] = explode('|', $c);
}
$content = json_encode($records);
echo $content;
?>
JavaScript will receive a JSON array like this:
[["1","name1","10","city1"],["2","name2","20","city2"],["3","name3","30","city3"],["4","name4","40","city4"],["5","name5","50","city5"],["6","name6","60","city6"],["7","name7","7","city7"],["8","name8","80","city8"],["9","name9","90","city9"],["10","name10","100","city10"],["11","name11","11","city11"],["12","name12","12","city12"],["13","name13","13","city13"],["14","name14","14","city14"],["15","name15","15","city15"],["16","name16","16","city16"],["17","name17","17","city17"],["18","name18","18","city18"],["19","name19","19","city19"],["20","name20","20","city20"],[""]]