I have a copy of 350+ images on my server. When a person tries to view one, if it is more than 5 minutes old, I want my server to check with another website that I am mirroring data from (they insist on me mirroring instead of hotlinking) and get the newest copy. Any thoughts on how to do this?
I could run a cron script and fetch all of the images, but there are problems with that approach (my host limits me to one run every 15 minutes, and I would have to fetch a lot of images that my users may or may not actually view).
I am thinking there should be a way to do this in PHP, but I have no idea where to start.
You could serve the images via a PHP script that performs the necessary checks before displaying the image:
<img src="/index.php/image-name.jpg">
Below is one option for the checks:
// get the image name from the URI (e.g. /index.php/image-name.jpg)
$image = explode("/", $_SERVER['REQUEST_URI'])[2];

// check if the image exists
if (is_file($image)) {
    // get the file's last-modified time
    $age = filemtime($image);
    if ($age < time() - (60 * 5)) { // more than 5 mins old
        // file too old, so check the mirror source for a new one
        // do your check and serve the appropriate image
    } else {
        // read the image and serve it to the user
        $fp = fopen($image, 'rb');
        // send the right headers
        header("Content-Type: image/jpeg");
        header("Content-Length: " . filesize($image));
        // dump the picture and stop the script
        fpassthru($fp);
        exit();
    }
} else {
    // handle error
}
You could apply AJAX in your project: call your server every 5 minutes using AJAX and refresh your content.
In short, AJAX is about loading data in the background and displaying it on the webpage without reloading the whole page.
I have a question about generating a QR code image in my application.
When a client clicks a button, the application generates a QR code image. My approach is to store it in the project directory, then print an <img> tag with the URL to the screen so the client can see it.
But I have a doubt: if multiple clients use the QR code at the same time, will the images get mixed up?
My code is below:
function generate_qrcode($url) {
    $filename = 'hante_qrcode.png';
    $errorCorrectionLevel = 'L';
    $matrixPointSize = 4;
    // generate the QR code image
    $o = QRcode::png($url, $filename, $errorCorrectionLevel, $matrixPointSize, 2);
    echo "<pre>";
    print_r($o);
    print_r('<img src="hante_qrcode.png">');
}
If they do get mixed up, how do I solve this problem?
But I have a doubt: if multiple clients use the QR code at the same time, will the images get mixed up?
Yes.
How do I solve this problem?
There are two ways to solve it:
1. Provide a unique name for every file, for example a timestamp from time() or the user ID. Because you are passing parameters while generating the QR code, you need to store the file. Generating without saving a file is also possible, but then you can't configure the pixel size and frame size. You can refer to the PHP QR code examples for this.
2. Don't store the image on the server at all; use a JavaScript library to generate the QR code directly on the client side.
Here is a demo for the second option; check whether you can use it:
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
<script src="https://cdn.rawgit.com/davidshimjs/qrcodejs/gh-pages/qrcode.min.js"></script>
<div id="qrcode"></div>
<script>
var qrcode = new QRCode("qrcode");
qrcode.makeCode('https://stackoverflow.com');
</script>
Of course it will be overwritten.
Solution 1
Create a unique filename for every image. This way you can keep the images for later use. Another benefit: you don't have to generate the image again for the same URL.
$filename = md5($url) . ".png";
if (!file_exists($filename)) {
    $o = QRcode::png($url, $filename, ...);
}
echo '<img src="' . $filename . '">';
Solution 2
If you don't want to save images for disk space reasons, you can serve the image directly. In your current code, the user sends a request to index.php and gets an image address back as the response; the browser then makes another request to fetch the image. Instead, you can return the image itself rather than returning HTML.
// image.php
$url = $_GET['url'];
// still give a unique filename, because another request can arrive while this one is processing
$filename = md5(microtime(true)) . "_qr.png";
$o = QRcode::png($url, $filename, ...);
$image = file_get_contents($filename);
// remove the file once its contents are stored in a variable
unlink($filename);
header('Content-Type: image/png');
header('Content-Length: ' . strlen($image));
echo $image;
// index.html
<img src="image.php?url=someurl">
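As a possible refinement (an assumption about the library, worth verifying against your copy of PHP QR code): QRcode::png() reportedly streams the PNG straight to the browser, sending its own Content-Type header, when the second argument ($outfile) is false, which avoids the temporary file entirely:
// image.php - sketch, assuming QRcode::png() streams the PNG to the
// browser (and sends its own Content-Type header) when the second
// argument ($outfile) is false - verify against your library version
$url = $_GET['url'];
QRcode::png($url, false, 'L', 4, 2);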
So I am working on an uploader for one of our websites, and one of the things I am trying to achieve is a way of uploading thousands of images via the website, instead of our clients using an FTP/SFTP solution.
One of the issues I am running into is upload speeds, so here is the current user flow:
The client clicks the 'Add Images' button and selects the images they wish to upload.
There is an @change handler set on the input, which processes the images for Vue by taking the data from the event.target.files array and adding it to the Vue data so that we can display the content.
There is a tick loop running that checks the first 10 images and preloads them so the client can see the first 10 image previews; any more would kill the browser's memory. As files get uploaded and removed from the array, the previews update, so the first 10 preview images are always displayed.
Once they are happy with this, our client clicks 'Upload Files', which starts the upload. This is also part of the tick loop: it checks whether anything is uploading and, if not, starts on the first file in the array.
It then sets the status of the file to uploading, so it shows in the UI, creates an XMLHttpRequest(), sets the URL, and creates a new FormData object, appending the image handle (the File(ID) object) and any other data that needs to be sent.
I set the request to use POST and set an onreadystatechange handler so that I can catch when it finishes. That basically sets the file state to uploaded and removes it from the array, unless there is an issue, in which case it moves the file to the issues array.
Now I send the request to the server, which receives the file in the $_FILES variable, resizes the image three times for the various sizes, saves the results to the correct place, and returns a success or failure message.
The main problem stems from the upload itself. The resize code is fairly quick, around 200-500ms, so the issue doesn't come from there, but from the actual upload.
Our internet is around 4MB per second; using FTP it takes around 300-400ms for a 4MB file, but in the browser it takes about 2.2s, and I am not sure why.
I understand that FTP/SFTP is a direct transfer, using chunks (I think), whereas we are making many Ajax requests, which in itself explains why it is slower; but is there no way to make this upload quicker?
Another thing to note: this is running within Joomla.
I am using the code below (amended so I can post it):
// Create new request
var http = new XMLHttpRequest();
// Set URL
var url = 'POST_API_URL';
// Create form data object
var params = new FormData();
params.append('name', this.input.files[i].name);
params.append('size', this.input.files[i].size);
params.append('type', this.input.files[i].type);
// Append file to form data object
params.append('images[]', this.input.files[i].handle,
this.input.files[i].name);
// Open post request
http.open("POST", url);
// On return
http.onreadystatechange = function() {
    // Check http codes
    if (http.readyState == 4 && http.status == 200) {
        // Write data to page
        window.vm.$data.app.response = http.responseText;
        // Get response array
        var response = JSON.parse(http.responseText);
        // Check response status
        if (response.status) {
            console.log('Completed');
        } else {
            console.log('Failed');
        }
    }
};
// Send request
http.send(params);
The PHP code to receive the file is here:
// Main upload function
public function save()
{
    // Wrap everything in a try/catch
    try {
        // Import system libraries
        jimport('joomla.filesystem.file');
        jimport('joomla.filesystem.folder');

        // Include the PHP image resizing library
        require_once JPATH_ROOT . '/lib/php-image-resize/lib/ImageResize.php';

        // Decode request data from the client
        $request = (object) $_POST;

        // Define the different sizes we need
        $sizes = array('small' => '200', 'medium' => '320', 'xlarge' => '2000');

        // Define auction number
        $auction = $request->auction;

        // Set the path of the uploaded file
        $path = $_FILES['images']['tmp_name'][0];

        // Create image object
        $image = new \Eventviva\ImageResize($path);

        // Loop the sizes so we can generate an image for each size
        foreach ($sizes as $key => $size) {
            // Resize image
            $image->resizeToWidth($size);

            // Set folder path
            $folder = JPATH_ROOT . '/catalogue_images/' . $auction . '/' . $key . '/';

            // Check if the folder exists; if not, create it
            if (!JFolder::exists($folder)) {
                JFolder::create($folder);
            }

            // Set file path
            $filepath = $folder . $request->name;

            // Save the resized file
            $image->save($filepath);
        }

        // Return to the client
        echo json_encode(array('status' => true));
    } catch (Exception $e) {
        // Return error, with message
        echo json_encode(array('status' => false, 'message' => $e->getMessage()));
    }
}
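For what it's worth, one direction worth exploring (my suggestion, not part of the original setup): since the receiving code only reads $_FILES['images']['tmp_name'][0], each request carries a single image, so every file pays the full HTTP round-trip overhead. Batching several images[] entries into one FormData request and looping over them server-side would amortize that cost; a minimal sketch of the receiving side:
// Sketch: handle a batch of files per request instead of one.
// Assumes the client appends several 'images[]' entries to the same FormData.
foreach ($_FILES['images']['tmp_name'] as $index => $tmpPath) {
    // original filename as sent by the browser
    $name = $_FILES['images']['name'][$index];

    // skip entries that failed to upload
    if ($_FILES['images']['error'][$index] !== UPLOAD_ERR_OK) {
        continue;
    }

    // resize and save each file as in save() above
    $image = new \Eventviva\ImageResize($tmpPath);
    // ... resize loop from the original code ...
}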
I am open to any ideas on how we could use chunked uploads or anything else, but do keep in mind that our clients can upload anywhere from 20 to 5000 images, and some clients upload around 4000-5000 quite often, so it needs to be robust enough to support this.
My last test:
Time taken: 51 minutes, 15 seconds
Files: 1000 images (jpg)
Sizes: between 1.5MB and 6.5MB
Also noticed that it gets slower as time progresses, adding maybe an extra 500ms to 1s on top of the 2.2s upload time.
Thanks in advance.
I have an image that is served by a PHP script. I call it like this:
<img src="/index.php/image-name.jpg">
If the image is more than 5 minutes old, my script retrieves a new copy of the image from a data provider and then displays the new image.
When the website that provides the images is under load and this script goes to fetch a new copy, it will often display only the top part of the image. Firebug tells me that the image is corrupt or truncated. If I open the image in a new tab, my server has a full copy. If I run the script a second time within 5 minutes, it works perfectly.
It looks like the fetch fails if it takes more than a certain amount of time, and only the top part is shown. Any thoughts on how to make it wait longer before giving up? Or maybe I am completely on the wrong track about what is going wrong.
<?php
// get the image name from the URI
$path = $_SERVER['REQUEST_URI'];
$image = explode("/", $path);
$image = $image[3]; // get the file name
$image = str_replace('%20', ' ', $image); // convert %20 back to spaces
$localimage = './road_images/' . $image; // where to find the image on the server
// check if the image exists; this prevents some kinds of attacks
if (is_file($localimage)) {
    $age = filemtime($localimage); // get the file age
    if ($age < time() - (60 * 5)) { // 5 mins old
        $simage = 'http://www.someplace/cams/' . $image;
        $simage = str_replace(' ', '%20', $simage); // URLs need the spaces encoded
        copy($simage, $localimage);
    }
    // serve the image to the user
    $fp = fopen($localimage, 'rb');
    // send the right headers
    header("Content-Type: image/jpeg");
    header("Content-Length: " . filesize($localimage));
    // dump the picture and stop the script
    fpassthru($fp);
    exit();
} else {
    echo("Error, no such file: '$image'");
}
?>
EDIT: I have discovered that by editing out
header("Content-Length: " . filesize($localimage));
it works as expected. I am still trying to figure out why.
That was painful. I was passing the wrong Content-Length header value. Editing out the Content-Length header solved it, and everything worked fine. Considering it is static content, I do not know why the code above does not work.
With more research I figured out a way that works.
I put ob_start() near the start.
The new Content-Length header, header('Content-Length: ' . ob_get_length());, goes at the bottom, just before the script exits.
Done that way it works every time and is nice to the browser.
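A minimal sketch of that pattern (my reconstruction of the fix described above, not the poster's exact file). One likely explanation, hedged since the poster never confirmed it: PHP caches stat() results, so after copy() overwrites the file, filesize() can still return the old (cached) size; calling clearstatcache() after the copy() would be another way to address it.
<?php
// Sketch: buffer the output so Content-Length can be computed
// from what was actually written, not from a possibly stale stat.
ob_start();

// ... the existing logic: check age, copy() a fresh image if stale ...
// clearstatcache(); // alternative fix: drop the cached filesize after copy()

header("Content-Type: image/jpeg");
$fp = fopen($localimage, 'rb');
fpassthru($fp);

// length of what we actually buffered, set just before the script exits
header('Content-Length: ' . ob_get_length());
ob_end_flush();
exit();
?>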
Is there a way in PHP to take some action (a MySQL insert, for example) if there are no new requests for, say, 1 second?
What I am trying to achieve is to determine the beginning and the end of an image sequence sent from an IP camera. The camera sends a series of images on detected movement and stops sending when movement stops. I know that the camera takes 5 images per second (every 200ms). When there are no new images for more than 1 second, I want to flag the last image as the end of the sequence, insert a record in MySQL, place the image in the appropriate folder (where all other images from the same sequence are already written), and instruct an app to make an MJPEG clip of the images in that folder.
Right now I am able to determine the first image in the sequence using the Alternative PHP Cache (APC) to save the reference time from the previous request. The problem is that the next image sequence can happen hours later, and I cannot instruct PHP to close the sequence when there are NO requests for some time, only when the first request of the new sequence arrives.
I really need help on this. My PHP sucks almost as much as my English... :)
Pseudo code for my problem:
<?php
if (isset($headers["Content-Disposition"]))
{
    $frame_time = microtime(true);
    if (preg_match('/.*filename=[\'\"]([^\'\"]+)/', $headers["Content-Disposition"], $matches)) {
        $filename = $matches[1];
    } elseif (preg_match("/.*filename=([^ ]+)/", $headers["Content-Disposition"], $matches)) {
        $filename = $matches[1];
    }
}

preg_match("/(anpr[1-9])/", $filename, $anprs);
$anpr = $anprs[1];
$apc_key = $anpr . "_last_time"; // there are several cameras, so I have to distinguish them

$last = apc_fetch($apc_key);
if (($frame_time - $last) > 1) {
    $stream = "START"; // new sequence starts
} else {
    $stream = "-->"; // streaming
}

// save the image file on the file system
$file = fopen('php://input', 'r');
$temp = fopen("test/" . $filename, 'w');
$imageSize = stream_copy_to_stream($file, $temp);
fclose($temp);
fclose($file);

apc_store($apc_key, $frame_time); // replace the cached time with this frame's time in APC

// here goes the MySQL stuff...

/* Now... if there are no new requests for 1 second, $stream = "END";
   call an app with exec() or similar to grab all images in the particular
   folder and make the MJPEG... if a new request arrives, cancel the timer
   or whatever and execute the script again. */
?>
Could you make each request usleep for 1.5 seconds before exiting, and as a last step check whether the sequence timestamp was updated? If yes, exit and do nothing. If no, save the sequence to MySQL. (This will require mutexes, since every HTTP request will be checking and trying to save the sequence, but only one must be allowed to.) See the sketch after this paragraph.
This approach would merge the sub-file/script into the PHP code (single codebase, easier to maintain), but it can balloon memory use (each request stays in memory for 1.5 seconds, which is a long time on a busy server).
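A minimal sketch of that first approach, under my assumptions: the frame timestamp lives in APC as in the question's code, and an flock()-based lock file stands in for the mutex (the sleep length and file names are illustrative):
// Sketch: after storing this frame, wait and see if it was the last one.
usleep(1500000); // 1.5 seconds, in microseconds

// if a newer frame arrived while we slept, it will finalize the sequence instead
$last = apc_fetch($apc_key);
if ($last > $frame_time) {
    exit();
}

// we appear to be the last frame: take the lock so only one request finalizes
$lock = fopen('/tmp/' . $anpr . '.lock', 'c');
if (flock($lock, LOCK_EX | LOCK_NB)) {
    // re-check under the lock in case another request beat us to it
    if (apc_fetch($apc_key) == $frame_time) {
        // insert the MySQL record and kick off the MJPEG build here
    }
    flock($lock, LOCK_UN);
}
fclose($lock);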
Another approach is to make the sub-file/script a loopback request to the localhost HTTP server, presumably with a much smaller memory footprint. Each frame would fire off a request to finalize the sequence (again with mutexes).
Or maybe create a separate service call that checks and saves all sequences, and have a cron job ping it every few seconds. Or have each frame ping it; if a second request detects that the service is already running, it can exit (share state in the APC cache).
Edit: I think I just suggested what bytesized said above.
What if you just keep the script running for 1 second after it stores the frame, to check for more added frames? I imagine you may want to close the connection before the 1 second expires; tomlgold2003 and arr1 have the answer for you: http://php.net/manual/en/features.connection-handling.php#93441
I think this would work for you:
<?php
ob_end_clean();
header("Connection: close\r\n");
header("Content-Encoding: none\r\n");
ignore_user_abort(true); // optional
ob_start();

// Do your stuff to store the frame

$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush(); // Strange behaviour: will not work
flush();        // unless both are called!
ob_end_clean();
// The connection should now be closed

sleep(1);
// Check to see if more frames have been added.
?>
If your server is expected to see a high load, this may not be the answer for you, since at 5 frames per second there will be 5 scripts at a time checking whether they submitted the last frame.
Store each request from the camera, with all its data and a timestamp, in a file (in PHP serialized form). In a cron job, run (every 10 seconds or so) a script that reads that file and finds requests followed by a gap of more than one second before the next request. Save the data from such requests and delete all the others.
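A minimal sketch of that cron script, under my assumptions: one serialized array of ['time' => ..., 'data' => ...] entries per file, with the file name and entry layout chosen for illustration only:
// Sketch: cron-side scan for frames followed by a >1s gap (sequence ends).
// 'requests.dat' and the ['time' => ..., 'data' => ...] layout are assumptions.
$entries = unserialize(file_get_contents('requests.dat'));

// sort by timestamp so the gaps are meaningful
usort($entries, function ($a, $b) {
    return ($a['time'] < $b['time']) ? -1 : 1;
});

$pending = array();
foreach ($entries as $i => $entry) {
    if (isset($entries[$i + 1])) {
        if (($entries[$i + 1]['time'] - $entry['time']) > 1) {
            // gap after this frame: it ends a sequence, so record it
            // (insert the MySQL row, build the MJPEG, etc.)
        }
        // frames inside a sequence are simply dropped
    } else {
        // newest frame: keep it, its sequence may still be running
        $pending[] = $entry;
    }
}

// keep only the unresolved tail for the next cron run
file_put_contents('requests.dat', serialize($pending));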
I have a networked camera that generates a video snapshot when http://192.0.0.8/image/jpeg.cgi is hit. The problem is that by accessing the root (i.e. 192.0.0.8) directly, users can access a live video feed, so I hope to hide the address altogether.
My proposed solution is to use PHP to retrieve the image and display it at http://intranet/monitor/view.php. Although users could create motion by hitting this new address repeatedly, I see that as unlikely.
I have tried using include() and readfile() in various ways, but I do not use PHP often enough to know whether I am going in the right direction. My best attempt to date output the JPEG contents, but I did not keep the code long enough to share it with you.
Any advice would be appreciated.
If you want to limit requests per user, use this:
session_start(); // required for $_SESSION to be available

$timelimit = 30; // limit in seconds

if (!isset($_SESSION['last_request_time'])) {
    $_SESSION['last_request_time'] = time();
}

if (time() > $_SESSION['last_request_time'] + $timelimit) {
    // prepare and serve a new image
    $_SESSION['last_request_time'] = time(); // reset the timer
} else {
    // serve an old image
}
If you want to limit the image refresh time for everyone, use the same script but save last_request_time in a place shared by all users (DB, file, cache); a file-based sketch follows.
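Here is that shared variant as a sketch (my illustration; the stamp file path is an assumption), using a file's own modification time as the shared timestamp:
// Sketch: rate-limit the camera fetch for all users via one shared file.
$stamp = '/tmp/camera_last_fetch'; // assumed path
$timelimit = 30; // seconds

clearstatcache(); // make sure filemtime() is fresh
if (!file_exists($stamp) || time() > filemtime($stamp) + $timelimit) {
    // prepare and serve a new image, then reset the shared timer
    touch($stamp);
} else {
    // serve the cached image
}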
A succinct way to do this is as follows:
header('Content-Type: image/jpeg');
readfile('http://192.0.0.8/image/jpeg.cgi');
The content of the JPEG is then streamed back to the browser as a file, directly from http://intranet/monitor/view.php.
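One caveat worth noting (my addition): readfile() with an HTTP URL only works when allow_url_fopen is enabled in php.ini. If it is off, a cURL-based sketch achieves the same thing:
// Sketch: same behaviour without allow_url_fopen, using cURL.
header('Content-Type: image/jpeg');

$ch = curl_init('http://192.0.0.8/image/jpeg.cgi');
curl_setopt($ch, CURLOPT_TIMEOUT, 10); // don't hang if the camera is busy
curl_exec($ch); // by default cURL writes the response straight to the output
curl_close($ch);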