I want to pass a function a string; the function tacks that string onto a URL, fetches the page at that URL, and returns it to my server so I can manipulate it with JS.
Any ideas would be much appreciated.
Cheers.
If URL fopen wrappers are enabled (the allow_url_fopen setting), you can use file_get_contents() to retrieve the page, then insert your JavaScript into the content before echoing it as output.
$content = file_get_contents('http://example.com/page.html');
if ($content !== FALSE) {
    // add your JS into $content
    echo $content;
}
This of course won't affect the original page.
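For instance, here is a minimal sketch of splicing a script tag into the fetched markup before echoing it; it assumes the fetched page contains a closing </body> tag, and the script filename is a placeholder:
// Minimal sketch: inject a <script> tag just before </body>.
// 'myjavascript.js' is a placeholder for your own script.
$content = file_get_contents('http://example.com/page.html');
if ($content !== FALSE) {
    $tag = "<script type='text/javascript' src='myjavascript.js'></script>";
    // str_ireplace() also matches </BODY>
    echo str_ireplace('</body>', $tag.'</body>', $content);
}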
You should be able to use fopen() for what you want. It can accept URLs.
echo "<script type='text/javascript' src='myjavascript.js'></script>";
$handle = #fopen("http://www.example.com/", "r");
if ($handle) {
while (!feof($handle)) {
$buffer = fgets($handle, 4096);
echo $buffer;
}
fclose($handle);
}
Using cURL would probably be easiest, but I prefer to do this myself. The following connects to a given host and returns the contents of the page. Note that it also returns the response headers, so watch out for that:
function do_request($host, $path, $data, $request, $specialHeaders = null, $type = "application/x-www-form-urlencoded", $protocol = "", $port = "80")
{
    $contentlen = strlen($data);
    $req = "$request $path HTTP/1.1\r\nHost: $host\r\nContent-Type: $type\r\nContent-Length: $contentlen\r\n";
    if (is_array($specialHeaders)) {
        foreach ($specialHeaders as $header) {
            $req .= $header;
        }
    }
    $req .= "Connection: close\r\n\r\n";
    if ($data != null) {
        $req .= $data;
    }
    $fp = fsockopen($protocol.$host, $port, $errno, $errstr);
    if (!$fp) {
        throw new Exception($errstr);
    }
    fputs($fp, $req);
    $buf = "";
    while (!feof($fp)) {
        $buf .= @fgets($fp);
    }
    fclose($fp);
    return $buf;
}
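For comparison, here is a minimal cURL sketch of the same fetch-and-return idea (the URL is a placeholder, and it assumes the cURL extension is available):
$ch = curl_init('http://www.example.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of echoing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
$body = curl_exec($ch);
if ($body === false) {
    echo 'cURL error: '.curl_error($ch);
}
curl_close($ch);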
I need to transfer files of any type or size over HTTP/GET in ~1 KB chunks. The resulting file's hash needs to match the source file's. This needs to be done in native PHP without any special tools. I have a basic strategy, but I'm getting odd results. This proof of concept just copies the file locally.
CODE
<?php
$input = "/home/lm1/Music/Ellise - Feeling Something Bad.mp3";
$a = pathinfo($input);
$output = $a["basename"];
echo "\n> ".md5_file($input);
$fp = fopen($input, 'rb');
if ($fp) {
    while (!feof($fp)) {
        $buffer = base64_encode(fread($fp, 1024));
        // echo "\n\n".md5($buffer);
        write($output, $buffer);
    }
    fclose($fp);
    echo "\n> ".md5_file($output);
    echo "\n";
}
function write($file, $buffer) {
    // echo "\n".md5($buffer);
    $fp = fopen($file, 'ab');
    fwrite($fp, base64_decode($buffer));
    fclose($fp);
}
?>
OUTPUT
> d31e102b1cae9c73bbf5a12615a8ea36
> 9f03f6c88ed61c07cb534922d6d31864
Thanks in advance.
fread already advances the file pointer, so there's no need to track it yourself. The same goes for fwrite, so consecutive calls automatically append to the given file. Thus, you can simplify your approach to the following (code adapted from this answer on how to efficiently write a large input stream to a file):
$src = "a.test";
$dest = "b.test";
$fp_src = fopen($src, 'rb');
if ($fp_src) {
$fp_dest = fopen($dest, 'wb');
$buffer_size = 1024;
while(!feof($fp_src)) {
fwrite($fp_dest, fread($fp_src, $buffer_size));
}
fclose($fp_src);
fclose($fp_dest);
echo md5_file($src)."\n"; // 88e4af2f85080a280e7f00e50d96b7f7
echo md5_file($dest)."\n"; // 88e4af2f85080a280e7f00e50d96b7f7
}
If you want to keep both processes separated, you'd do:
$src = "a.test";
$dest = "b.test";
if (file_exists($dest)) {
unlink($dest); // So we don't append to an existing file
}
$fp = fopen($src,'rb');
if ($fp) {
while(!feof($fp)){
$buffer = base64_encode(fread($fp, 1024));
write($dest, $buffer);
}
fclose($fp);
}
function write($file, $buffer) {
$fp = fopen($file, 'ab');
fwrite($fp, base64_decode($buffer));
fclose($fp);
}
echo md5_file($src)."\n"; // 88e4af2f85080a280e7f00e50d96b7f7
echo md5_file($dest)."\n"; // 88e4af2f85080a280e7f00e50d96b7f7
As for how to stream files over HTTP, you might want to have a look at:
Streaming a large file using PHP
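As a rough sketch of what the serving side can look like (a plain PHP endpoint; the filename and content type are placeholders), you can stream the file in fixed-size chunks:
$file = 'bigfile.bin'; // placeholder filename
header('Content-Type: application/octet-stream');
header('Content-Length: '.filesize($file));
$fp = fopen($file, 'rb');
while (!feof($fp)) {
    echo fread($fp, 1024); // send ~1 KB at a time
    flush();               // push each chunk to the client as it is read
}
fclose($fp);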
Here's my question: I have a task that runs every minute and pushes a lot of data to another MySQL database. If the first run hasn't finished, the next one starts anyway, so there's a concurrency problem. How can I resolve it?
I have some ideas: first, make each run finish before the next one is scheduled to start; second, make the task support multiple processes. But I don't know how to write the code.
public function execute(Input $input, Output $output)
{
    $tele_data = Telesales::field('*')->where([['create_time','<',time()-48*3600],['customer_label','in',[2,6,7]],['virtual_sale','=','0']])->whereRaw('phone is not null')->select()->toArray();
    foreach ($tele_data as $key => $value) {
        static::pushTeleToIdc($value);
    }
}
private static function pushTeleToIdc($data = []) {
    $res = Telesales::where('id', $data['id'])->update(['virtual_sale' => '1']);
    if (!$res) {
        return;
    }
    $url = config('idc.tele_url');
    $key = config('idc.tele_key');
    $channel = config('idc.tele_channel');
    $time = time();
    $sign = md5($key.$channel.$time);
    $urls = $url."?channel=".$channel."&sign=".$sign."&time=".$time;
    $require_params = config('idc.require_params');
    foreach ($require_params as $key => $value) {
        if (array_key_exists($key, $data) && !empty($data[$key])) {
            $d[$key] = $data[$key];
        } else {
            $d[$key] = empty($value) ? '' : $value[array_rand($value, 1)];
        }
    }
    $d['register_time'] = $d['create_time'];
    $res = post_url($urls, $d);
    $result = json_decode($res, true);
    if (isset($result['code']) && $result['code'] != 0) {
        Log::init(['single' => 'tpushidc'])->error($res);
    }
}
Could you help me resolve the problem?
The easiest thing to do is to set up a flag that records that the process is already in progress, and check it at the start of the function. I don't know how you want to organize the visibility of your code, so I leave it to you to extract $myFile to the file/class scope (the same goes for the file path; you probably want a /var or /log folder for such things).
The gist: we create a file; if it doesn't exist or contains a 0, we can start working. If, on the other hand, it contains a 1, the process dies, and it will keep dying every time you run it until the first run finishes and rewrites the file's contents to 0 (meaning the process is no longer in progress).
public function execute(Input $input, Output $output)
{
    if ($this->isProcessInProgress()) {
        die('Process is in progress');
    }
    $this->startProcess();
    $tele_data = [...];
    foreach ($tele_data as $key => $value) {
        static::pushTeleToIdc($value);
    }
    $this->finishProcess();
}

private function isProcessInProgress() {
    $myFile = 'tele_to_idc_process.txt';
    $handle = fopen($myFile, 'r');
    if (!$handle)
        return false;
    $status = fread($handle, 1);
    fclose($handle);
    return (bool) $status;
}

private function startProcess() {
    $myFile = 'tele_to_idc_process.txt';
    $handle = fopen($myFile, 'w');
    if (!$handle)
        return;
    $status = fwrite($handle, '1');
    fclose($handle);
}

private function finishProcess() {
    $myFile = 'tele_to_idc_process.txt';
    $handle = fopen($myFile, 'w');
    if (!$handle)
        return;
    $status = fwrite($handle, '0');
    fclose($handle);
}
You might get a warning if the file doesn't exist; you can suppress it by using @fopen instead of fopen.
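Note that a flag file isn't atomic: two runs starting at the same instant can both read 0 before either writes 1. Here is a sketch of a more robust variant using PHP's flock() (the lock file path is a placeholder):
// Mutual exclusion via an OS-level advisory lock.
// 'tele_to_idc.lock' is a placeholder path.
$lock = fopen('tele_to_idc.lock', 'c');
if (!$lock || !flock($lock, LOCK_EX | LOCK_NB)) {
    die('Process is in progress');
}
// ... do the work ...
flock($lock, LOCK_UN); // the lock is also released automatically when the script exits
fclose($lock);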
I'm trying to use Mailchimp's Export API to generate a CSV file of all members of a given list. Here's the documentation and the example PHP code they give:
$apikey = 'YOUR_API_KEY';
$list_id = 'YOUR_LIST_ID';
$chunk_size = 4096; // in bytes
$url = 'http://us1.api.mailchimp.com/export/1.0/list?apikey='.$apikey.'&id='.$list_id;

/** a more robust client can be built using fsockopen **/
$handle = @fopen($url, 'r');
if (!$handle) {
    echo "failed to access url\n";
} else {
    $i = 0;
    $header = array();
    while (!feof($handle)) {
        $buffer = fgets($handle, $chunk_size);
        if (trim($buffer) != '') {
            $obj = json_decode($buffer);
            if ($i == 0) {
                // store the header row
                $header = $obj;
            } else {
                // echo, write to a file, queue a job, etc.
                echo $header[0].': '.$obj[0]."\n";
            }
            $i++;
        }
    }
    fclose($handle);
}
This works well for me, and when I run this file I end up with a bunch of data in this format:
Email Address: xdf@example.com
Email Address: bob@example.com
Email Address: gerry@example.io
What I want to do is turn this into a CSV (to put in a place on my server) instead of echoing the data. Is there a library or simple syntax/snippet I can use to make this happen?
If a format simply like:
Email Address, xdf@example.com
Email Address, bob@example.com
Email Address, gerry@example.io
is what you're after, then you can do:
$handle = @fopen($url, 'r');
$csvOutput = "";
if (!$handle) {
    echo "failed to access url\n";
} else {
    $i = 0;
    $header = array();
    while (!feof($handle)) {
        $buffer = fgets($handle, $chunk_size);
        if (trim($buffer) != '') {
            $obj = json_decode($buffer);
            if ($i == 0) {
                // store the header row
                $header = $obj;
            } else {
                // echo, write to a file, queue a job, etc.
                echo $header[0].', '.$obj[0]."\n";
                $csvOutput .= $header[0].', '.$obj[0]."\n";
            }
            $i++;
        }
    }
    fclose($handle);
}
$filename = "data".date("m.d.y").".csv";
file_put_contents($filename, $csvOutput);
The variable $csvOutput contains the CSV format string.
This one's on me. From now on you might want to actually read some documentation instead of copying and pasting your way through life; others will not be as nice as I am. Here's the list of filesystem functions on the PHP website: http://php.net/manual/en/ref.filesystem.php. Getting the file output into the desired CSV format is left as an exercise for the reader.
$apikey = 'YOUR_API_KEY';
$list_id = 'YOUR_LIST_ID';
$chunk_size = 4096; // in bytes
$url = 'http://us1.api.mailchimp.com/export/1.0/list?apikey='.$apikey.'&id='.$list_id;

/** a more robust client can be built using fsockopen **/
$handle = @fopen($url, 'r');
if (!$handle) {
    echo "failed to access url\n";
} else {
    $i = 0;
    $header = array();
    $output = ''; // output buffer for the file we are going to write
    while (!feof($handle)) {
        $buffer = fgets($handle, $chunk_size);
        if (trim($buffer) != '') {
            $obj = json_decode($buffer);
            if ($i == 0) {
                // store the header row
                $header = $obj;
            } else {
                // write data into our output buffer for the file
                $output .= $header[0].': '.$obj[0]."\n";
            }
            $i++;
        }
    }
    fclose($handle);
    // now write it to file
    $path = '/path/to/where/you/want/to/store/file/';
    $file_name = 'somefile.csv';
    // create a file resource to write to
    $fh = fopen($path.$file_name, 'w+');
    // write to the file
    fwrite($fh, $output);
    // close the file
    fclose($fh);
}
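As a side note, PHP's built-in fputcsv() takes care of quoting and escaping for you. Here is a minimal sketch of the write step using it (the rows shown are placeholders for the header/value pairs collected above):
// Write rows with proper CSV quoting via fputcsv().
$rows = [
    ['Email Address', 'xdf@example.com'],
    ['Email Address', 'bob@example.com'],
];
$fh = fopen('somefile.csv', 'w');
foreach ($rows as $row) {
    fputcsv($fh, $row); // quotes fields containing commas or quotes
}
fclose($fh);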
I want to retrieve email from Gmail's IMAP server, but the problem is that the responses from the server span multiple lines (as demonstrated here) and fgets only retrieves one line.
I've tried fgets, fread, and socket_read, but none of them work, so either I'm using the wrong method or using the methods incorrectly. I also tried this tutorial, but it didn't work either. I would appreciate it if someone could help me with this.
Thanks, and I'm really sorry if this is an amateur question.
Code:
<?php
$stuff = fsockopen('ssl://imap.gmail.com', 993);
$reply = fgets($stuff, 4096);
echo 'connection: '.$reply.'<br/>';

$request = fputs($stuff, "a1 LOGIN MyUserName Password\r\n");
$receive = socket_read($stuff, 4096);
echo 'login: '.$receive.'<br/>';

$request = fputs($stuff, "a2 EXAMINE INBOX\r\n");
$reply = '';
while (!feof($stuff))
    $reply .= fread($stuff, 4096);
echo $reply;

/*
$request = fputs($stuff, 'a3 FETCH 1 BODY[]\r\n');
$reply = fgets($stuff);
echo $reply;
*/
?>
Max's answer below works. This is my implementation of it.
private function Response($instructionNumber)
{
    $response = '';
    $end_of_response = false;
    while (!$end_of_response) {
        $line = fgets($this->connection, self::responseSize);
        $response .= $line.'<br/>';
        if (preg_match("/$instructionNumber (OK|NO|BAD)/", $response, $responseCode))
            $end_of_response = true;
    }
    return array('code' => $responseCode[1],
                 'response' => $response);
}
Generally, you know to stop reading when you get the OK/BAD/NO response for the tag you sent. If you send a1 LOGIN ... you stop when you get a1 OK/BAD/NO ....
It's been a while since I wrote PHP, and I don't know that much about IMAP, but if it's anything like NNTP, your code would look a bit like this (written in the SO editor, so it might be buggy):
$buffer = '';

function read_line($socket) {
    global $buffer;
    while (strpos($buffer, "\n") === false)
        $buffer .= fread($socket, 1024);
    $lineEnd = strpos($buffer, "\n");
    $line = rtrim(substr($buffer, 0, $lineEnd), "\r"); // strip the trailing "\r"
    $buffer = substr($buffer, $lineEnd + 1);           // skip past the "\n"
    return $line;
}

function send_line($socket, $line) {
    fwrite($socket, $line);
}
$socket = fsockopen('ssl://imap.gmail.com', 993);
$welcome = read_line($socket);

send_line($socket, "a1 LOGIN MyUserName Password\r\n");
$reply = read_line($socket);

send_line($socket, "a2 EXAMINE INBOX\r\n");
while (($reply = trim(read_line($socket))) != '.') {
    echo $reply.PHP_EOL;
}
echo "Done";
The basic concepts are:
Always buffer all incoming data. PHP doesn't handle lines very well, so do the splitting yourself.
Don't read blindly; know what to expect. You expect one welcome line, LOGIN has one reply, and EXAMINE INBOX keeps outputting data until there's a single dot, so stop reading as soon as you see it.
You'll most likely want a simple function to take care of the reading. You could even write another helper to make things easier:
function read_block($socket) {
    $block = '';
    while ('.' != trim($reply = read_line($socket))) {
        $block .= $reply;
    }
    return $block;
}
I need to retrieve a small amount of data from a very large remote XML file that I access via HTTP. I only need a portion at the beginning of the file, but the files I am accessing can often be so large that downloading them in full causes a timeout. It seems like it should be possible with fsockopen to pull only as much as needed before closing the connection, but nothing I have tried has worked.
Below is a simplified version of what I have been trying. Can anyone tell me what I need to do differently?
<?php
function socketopen($funcsite, $funcheader) {
    $fp = fsockopen($funcsite, 80, $errno, $errstr, 5);
    $buffer = NULL;
    $k = 0;
    if ($fp) {
        fwrite($fp, "GET " . $funcheader . " HTTP/1.0\r\nHost: " . $funcsite . "\r\n\r\n");
        while (!feof($fp)) {
            $buffer = fgets($fp, 4096);
            echo $buffer;
            if ($k == 200) {
                break;
            }
            $k++;
        }
        fclose($fp);
    } else {
        print "No Response:";
    }
    return (html_entity_decode($buffer));
}

$site = "www.remotesite.com";
$header = "/bigdatafile.xml";
$data = socketopen($site, $header);
?>
This works fine, but it always opens and downloads the entire remote file. (I actually use a different conditional than if ($k == x), but that shouldn't matter.)
Any help greatly appreciated. -Jim
Any reason not to use file_get_contents() instead?
$buffer = html_entity_decode(file_get_contents('http://www.remotesite.com/bigdatafile.xml', 0, null, $offsetBytes, $maxlenBytes));
You just need to specify $offsetBytes and $maxlenBytes. One caveat: the maxlen argument works fine on remote streams, but seeking to a nonzero offset is unreliable there, since HTTP streams aren't seekable.
Try this:
set_time_limit(0);
echo $buffer = html_entity_decode(file_get_contents('http://www.remotesite.com/bigdatafile.xml', 0, null, 1024, 4096));
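If the server supports it, a more dependable way to grab only the first bytes of a remote file is to send an HTTP Range request through a stream context (a sketch; the URL and byte range are placeholders):
// Request only the first 4 KB via an HTTP Range header.
// Requires a server that honors Range requests.
$context = stream_context_create([
    'http' => ['header' => "Range: bytes=0-4095\r\n"],
]);
$head = file_get_contents('http://www.remotesite.com/bigdatafile.xml', false, $context);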
With this code you could download the entire RSS feed:
if (!$xml = simplexml_load_file("http://remotesite.com/bigrss.rss"))
{
    throw new RuntimeException('Unable to load or parse feed');
}
else
{
    file_put_contents('mybigrss.rss', $xml->asXML());
}
But if you want to get just some parts, then do the following (assuming $s_handle is a handle opened on the remote file):
$s_handle = fopen('http://www.remotesite.com/bigdatafile.xml', 'r');
$limit = 512000; // set a limit here, in bytes
$sourceData = fread($s_handle, $limit);
// your code etc.
Or with eof:
$source = '';
while (!feof($s_handle))
    $source .= fread($s_handle, 1024); // set limit