Cache multiple pages/images from Instagram - php

I'm working on a small project where users can see images tagged with, in this case, "kitties". Instagram only allows 5000 requests/hour. I don't think it will reach that, but I'm choosing to cache anyway, also because I can't figure out how to get the back-link to work.
I can only get the link for the next page; the link for the recent page then becomes the current page, a link to itself.
Also, the API can return a strange number of images, sometimes 14, sometimes 20 and so on. I want it to always show 20 images per page and have only 5 pages (100 images), and then update this file every 5-10 minutes or so.
So, my plan is to store around 100 images in a file. I got it working, but it's incredibly slow.
The code looks like this:
$cachefile = "instagram_cache/".TAG.".cache";
$num_requests = 0; //Just for developing and check how many request it does
//If the file does not exsists or is older than *UPDATE_CACHE_TIME* seconds
if (!file_exists($cachefile) || time()-filemtime($cachefile) > UPDATE_CACHE_TIME)
{
$images = array();
$current_file = "https://api.instagram.com/v1/tags/".TAG."/media/recent?client_id=".INSTAGRAM_CLIENT_ID;
$current_image_index = 0;
for($i = 0; $i >= 0; $i++)
{
//Get data from API
$contents = file_get_contents($current_file);
$num_requests++;
//Decode it!
$json = json_decode($contents, true);
//Get what we want!
foreach ($json["data"] as $x => $value)
{
array_push($images, array(
'img_nr' => $current_image_index,
'thumb' => $value["images"]["thumbnail"]["url"],
'fullsize' => $value["images"]["standard_resolution"]["url"],
'link' => $value["link"],
'time' => date("d M", $value["created_time"]),
'nick' => $value["user"]["username"],
'avatar' => $value["user"]["profile_picture"],
'text' => $value['caption']['text'],
'likes' => $value['likes']['count'],
'comments' => $value['comments']['data'],
'num_comments' => $value['comments']['count'],
));
//Check if the requested amount of images is equal or more...
if($current_image_index > MAXIMUM_IMAGES_TO_GET)
break;
$current_image_index++;
}
//Check if the requested amount of images is equal or more, even in this loop...
if($current_image_index > MAXIMUM_IMAGES_TO_GET)
break;
if($json['pagination']['next_url'])
$current_file = $json['pagination']['next_url'];
else
break; //No more files to get!
}
file_put_contents($cachefile, json_encode($images));
This feels like a very ugly hack. Any ideas for how to make this work better?
Or can someone tell me how to make that "back-link" work like it should? (Yes, I could use JS and go -1 in history, but no!)
Any ideas, suggestions, help, comments etc. are appreciated.
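For what it's worth, once all 100 images sit in one local cache file, the back-link problem largely disappears: a page is just a slice of the cached array, and the previous-page link is plain arithmetic. A minimal sketch built on the cache file above ($cachefile and the 20-per-page layout come from the question; the ?page parameter is an assumption):

// Read the cache and slice out the requested page: 20 images per page
$images = json_decode(file_get_contents($cachefile), true);

$per_page = 20;
$total_pages = (int) ceil(count($images) / $per_page);
$page = isset($_GET['page']) ? max(1, min($total_pages, (int) $_GET['page'])) : 1;

$page_images = array_slice($images, ($page - 1) * $per_page, $per_page);

// Back- and forward-links become simple arithmetic on the page number
$prev = $page > 1 ? "?page=" . ($page - 1) : null;
$next = $page < $total_pages ? "?page=" . ($page + 1) : null;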

Why not subscribe to real-time updates and store the images in the DB? Then, when they are rendered, you can check whether the image URL is still valid (i.e. whether the photo has been deleted). Getting the data from your own DB will be much faster than from Instagram.
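To illustrate the deleted-photo check from that suggestion (a sketch: get_headers() fetches the response headers for a URL, and anything but a 200 status line is treated as a deleted image):

// Returns true when the stored image URL still answers with HTTP 200
function image_still_exists($url)
{
    $headers = @get_headers($url);
    // get_headers() returns false on failure; element 0 is the status line
    return $headers !== false && strpos($headers[0], '200') !== false;
}

// Usage: skip cached entries whose thumbnail has gone away
if (image_still_exists($image['thumb'])) {
    // render it
}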

Related

queuing multiple file uploads in php

I'm sorry if this has been asked; I can't find the correct keywords to search for on Google, so I decided to ask here on SO.
Right now I have an upload function built in Laravel that can upload up to 20 images at a time, but the client would like it to be 50 images at a time. I don't think I should edit php.ini every time the client wants to raise the maximum number of simultaneous uploads, because it might break the server.
Is it possible in PHP to upload in a queue, say 10 images this second, then 10 images the next second, until the upload is done, so the server won't break?
$error = 0;       // counters assumed to start at zero before the loop
$items = array();
foreach ($request->file('image') as $key => $file)
{
    // Only accept images that are exactly 533x800 or 800x533
    $imagesize = getimagesize($file);
    if( ($imagesize[0] == 533 && $imagesize[1] == 800) == false &&
        ($imagesize[0] == 800 && $imagesize[1] == 533) == false
    ) {
        $error++;
        continue;
    }
    // Enforce the 24-image cap before moving the file, not after
    $count = PropertyImages::where('property_id', $id)->count();
    if( $count > 23 )
    {
        return response()->json(['success' => false, 'msg' => 'Images must not exceed 24']);
    }
    $filename = md5(time() . uniqid()) . '.' . $file->getClientOriginalExtension();
    $file->move('uploads', $filename);
    $data['image'] = url('uploads/' . $filename);
    // Append at the end of the current display order
    $order_id = 1;
    $image = PropertyImages::where('property_id', $id)->orderBy('order_id', 'DESC')->first();
    if( $image )
    {
        $order_id = $image->order_id + 1;
    }
    $item = PropertyImages::create([
        'property_id' => $id,
        'filename'    => $filename,
        'order_id'    => $order_id
    ]);
    $items[$key]['id'] = $item->id;
    $items[$key]['filename'] = url('uploads/' . $item->filename);
}
I am not sure what setup you are using for uploading images, so I assume you have a plain simple form with 20 file input fields.
php.ini limits both the size and the shape of a request: post_max_size = 20M caps the request body at 20 MB, max_file_uploads = 20 caps the number of uploaded files per request (its default is exactly 20, which may be where your current limit comes from), and max_input_vars caps the number of input variables. So it all depends on the limits in your php.ini.
If one request can carry 20 images, sending 50 images to the server will take 3 requests (20 + 20 + 10 for the last).
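For reference, you can check the effective limits at runtime (a quick sketch; the defaults noted in the comments are the usual stock values):

// Inspect the php.ini directives that govern a single upload request
echo ini_get('max_file_uploads');    // max number of files per request (default 20)
echo ini_get('post_max_size');       // size cap for the whole POST body
echo ini_get('upload_max_filesize'); // size cap per individual file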
Now if you are using a simple plain form, it can't do the job, because clicking the submit button submits the form only once. Instead you will have to submit the form using JavaScript (or jQuery), which lets you send the images in multiple AJAX requests.
Here is a jQuery plugin that can do the job very efficiently
https://github.com/zimt28/laravel-jquery-file-upload
A little help for using it with Laravel:
https://laracasts.com/discuss/channels/general-discussion/jquery-file-upload-with-laravel
Happy Coding!!

How to update database based on variable api result?

I would like your help finding the best and cleanest way to update my video database (MySQL).
My server uses an external API to grab thousands of videos (each video including a unique id, folder id, title, url...).
Those videos are organised into folders, and I can only grab those folders one by one.
As I want to make as few API calls as possible, I would like to run a daily cron job that browses each folder, adds the new videos, and removes the deleted ones.
At the moment I can browse all the folders, looping through each video and adding it to the database:
$i = 0;       // running total, assumed to start at zero
$offset = 0;  // paging offset into the folder, assumed to start at zero
while (true) {
    $videos = video_get($oid, $offset, $count);
    $videos = $videos['response']['items'];
    $results = count($videos);
    echo $results . ' || ' . $offset;
    if ($results > 0) {
        $offset += $results;
        foreach ($videos as $video) {
            $i++;
            echo "<br>Number : " . $i;
            echo "<br>";
            $data = array(
                "id"        => $video['id'],
                "folder_id" => $video['folder_id'],
                "title"     => $video['title'],
                "url"       => $video['url']
            );
            $db->insert('videos', $data);
        }
    } else {
        break;
    }
}
What I am asking is: considering that there are about 100,000 videos arranged into around 20 folders, would it be better to add all the videos and then remove the duplicates, or to check whether each video already exists and only insert it when it is new?
And more importantly, what would be the best way to remove the videos that are no longer present in the API response?
I hope I have been clear enough, but just in case:
Videos are ordered by folder.
Videos can be added/removed at any time on the api side.
I want to keep all the videos in my database and update them based on what has been added/removed on the API side, in order to avoid calling the API too often.
Thank you for your help.
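One pattern that covers both questions (a sketch, not a tested solution; the last_seen column and the PDO connection $pdo are assumptions on top of the videos table from the question): upsert every video the API returns, stamping it with the sync time, then prune the rows the sync never touched. The upsert side also settles the duplicate question, since re-inserting an existing video is harmless.

// Daily sync sketch, assuming a UNIQUE key on videos.id and an extra
// last_seen DATETIME column; $pdo is an assumed PDO connection.
$syncTime = date('Y-m-d H:i:s');

$upsert = $pdo->prepare(
    "INSERT INTO videos (id, folder_id, title, url, last_seen)
     VALUES (:id, :folder_id, :title, :url, :seen)
     ON DUPLICATE KEY UPDATE
         folder_id = VALUES(folder_id),
         title     = VALUES(title),
         url       = VALUES(url),
         last_seen = VALUES(last_seen)"
);

foreach ($allVideosFromApi as $video) { // $allVideosFromApi: everything collected by the loop above
    $upsert->execute([
        ':id'        => $video['id'],
        ':folder_id' => $video['folder_id'],
        ':title'     => $video['title'],
        ':url'       => $video['url'],
        ':seen'      => $syncTime,
    ]);
}

// Anything not stamped during this run is gone on the API side
$prune = $pdo->prepare("DELETE FROM videos WHERE last_seen < :seen");
$prune->execute([':seen' => $syncTime]);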

Randomly Rotate PHP files in sidebar on page refresh

I have 4 php files which all have a small PHP and jQuery game inside.
The files are as follows:
/game1.php
/game2.php
/game3.php
/game4.php
Every time the page is refreshed I want one of the games to show in the sidebar. When the page is refreshed again, a different game should show, and so on.
Is there a way to include files in the sidebar at random via some kind of query on page refresh, if so, could someone please help me with the code. Thanks!
$games = array('game1.php','game2.php','game3.php','game4.php');
session_start();

$used = array();
if (isset($_SESSION['used'])) {
    $used = $_SESSION['used'];
}

$usable = array_diff($games, $used);
if (sizeof($usable) == 0) {
    $usable = $games;
    $_SESSION['used'] = array();
}

$inUse = $usable[array_rand($usable)];
$_SESSION['used'][] = $inUse;
include($inUse);
Try:
include 'game' . rand(1, 4) . '.php'; // a leading slash would resolve from the filesystem root, not the site root
$FileNames = array ('File1.php','File2.php','File3.php','File4.php'); // Can later be autodetected if the file structure gets too big
// array_rand() returns a random key, not a value, so look the name up
$Links = $FileNames[array_rand($FileNames)];
echo $Links;
If you wish to make these clickable:
echo '<a href="'.$Links.'">'.$Links.'</a>';
$file_array = array('game1.php','game2.php','game3.php','game4.php');
$rand = rand(0, count($file_array) - 1); // count() itself would be one past the last valid index
include($file_array[$rand]);
I think you'll find your answer in sessions, since you don't want the last viewed game to show up on the next page refresh:
session_start();
//Fill the array
$games = array('game1' => 'PONG', 'game2' => 'Donkey Kong', 'game3' => 'Patience', 'game4' => 'LoL');
//Delete the current game from the array
if (isset($_SESSION['currentGame'])) {
    unset($games[$_SESSION['currentGame']]);
}
//Pick a random key from what is left (shuffle() only returns a bool, so it can't be chained into end())
$game = array_rand($games);
include($game.'.php');
//Update the session for the next page refresh
$_SESSION['currentGame'] = $game;

Creating a new Page of Posts in Wordpress

I am currently trying to make an e-zine using WordPress, and have most of it done. The homepage displays a list of the "pieces" included in that edition of the e-zine. I would like it so that, when an edition expires (I'm currently using the Post Expirator plugin), a new page is created automatically that resembles the front page, showing the index of that particular (now expired) edition.
I'm not very experienced with PHP, and still a newbie at WordPress. How could I accomplish this?
The idea is this: you just have to get the expiration date and build a condition with it. You only need basic PHP skills to do it. Here's the logic:
// $curdate and $expiration_date are assumed to hold comparable timestamps
if($curdate > $expiration_date){ //you can change the condition to ">=" if you want to create the post on as well as after the expiration date
    //sample wordpress wp_insert_post
    $my_post = array(
        'post_title'    => 'My post',
        'post_content'  => 'This is my post.',
        'post_status'   => 'publish',
        'post_author'   => 1,
        'post_category' => array(8,39)
    );
    // Insert the post into the database
    wp_insert_post( $my_post );
}
for more info visit http://codex.wordpress.org/Function_Reference/wp_insert_post
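If the missing piece is where $curdate and $expiration_date come from, something along these lines could work (a sketch; the '_expiration-date' meta key is an assumption about how the expiration plugin stores its date, so verify it against your plugin):

// Sketch: compare the current time against an expiration timestamp in post meta.
// '_expiration-date' is an assumed meta key; check your plugin's documentation.
$curdate         = current_time('timestamp');
$expiration_date = (int) get_post_meta($post_id, '_expiration-date', true);
if ($expiration_date && $curdate > $expiration_date) {
    // ...wp_insert_post() as shown above
}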
Here is what I ended up doing, using Felipe's suggestions as a starting point. There might be a less convoluted way of doing this, but, as I said, I'm just a beginner, so this is what I came up with:
First, I created a $volnum variable, which keeps track of the current volume number. Then I start an output buffer for the front page so that later I can save its contents as an independent HTML document.
This goes at the beginning of index.php, before get_header():
<?php $volnum = ''; ?>
<?php ob_start(); ?>
On the front page I have an editorial and, next to it, the content index. The editorial's tag is always "voln", where 'n' is the volume number, so I save that tag's name as the volume number (the foreach is probably unnecessary since the editorial only has one tag):
<?php $tags = get_the_tags();
foreach ($tags as $tag) {
    $volnum = $tag->name;
}
?>
Finally, at the end of the document, after the last html, I have added the following code:
<?php
$handle = opendir("past_vol/");
$volExists = false;
// Walk the directory and look for an existing file for this volume
while (($name = readdir($handle)) !== false) {
    if ($volnum.".html" == $name) {
        $volExists = true;
        break;
    }
}
closedir($handle);
if ($volExists == false) {
    $cachefile = "past_vol/".$volnum.".html";
    $fp = fopen($cachefile, 'w');
    fwrite($fp, ob_get_contents());
    fclose($fp);
}
ob_end_flush();
?>
"past_vol" is the directory where I am saving the past volume html files. So the directory is opened, the amount of files is counted, and a loop that goes through each file's name is started. If a file with the same name as $volnum, then $volExists is true. If at the end of the loop $volExists is false, then it saves the cached page.
Again, it could probably be optimized a whole lot, but for now this works!
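As a footnote, the whole directory scan can collapse into a single file_exists() check (same logic, just shorter):

<?php
$cachefile = "past_vol/".$volnum.".html";
if (!file_exists($cachefile)) {
    file_put_contents($cachefile, ob_get_contents());
}
ob_end_flush();
?>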

Errors while exporting to .xls using PHP

I've asked a similar question previously, but I was told my question was just me being lazy, so let me rephrase.
I've been using a PHP class script to export my SQL data to a .xls file, but the resultant Excel file doesn't display any values, and no error is displayed on the webpage itself.
The class file I'm using is documented in the link below:
http://web.burza.hr/blog/php-class-for-exporting-data-in-multiple-worksheets-excel-xml/
And I've incorporated it into my site as follows:
$dbase->loadextraClass('excel.xml');
$excel = new excel_xml();
$header_style = array(
    'bold'    => 1,
    'size'    => '14',
    'color'   => '#000000',
    'bgcolor' => '#ffffff'
);
$excel->add_style('header',$header_style);
if(isset($_POST['fec_save']))
{
    if($_POST['reporttype']=='films')
    {
        $films = $dbase->runquery("SELECT datetime,title,country_of_origin,language,runningtime,(SELECT name FROM fec_client WHERE filmid = fec_film.filmid) AS client, (SELECT rating_decision FROM fec_rating_report WHERE filmid = fec_film.filmid) AS rating FROM fec_film WHERE datetime >= '".strtotime($_POST['fromdate'])."' AND datetime <= '".strtotime($_POST['todate'])."'",'multiple');
        $filmcount = $dbase->getcount($films);
        //continue with excel buildup
        $columns = array('Date','Title','Origin','Language','Minutes','Client','Rating');
        $excel->add_row($columns,'header');
        for($i=1; $i<=$filmcount; $i++)
        {
            $film = $dbase->fetcharray($films);
            // add_row() presumably expects an array of cells, as in the header
            // row above, rather than one argument per column
            $excel->add_row(array($film['datetime'],$film['title'],$film['country_of_origin'],$film['language'],$film['runningtime'],$film['client'],$film['rating']));
        }
        $excel->create_worksheet('From_'.str_replace(' ','',$_POST['fromdate']).'_to'.str_replace(' ','',$_POST['todate']));
        $xml = $excel->generate();
        $excel->download('FilmsClassified_from_'.str_replace(' ','',$_POST['fromdate']).'_to'.str_replace(' ','',$_POST['todate']));
    }
}
I would like some assistance as to what I may be doing wrong.
Thanks
Too long to post as a comment:
$xml = $excel->generate();
creates all the XML file content as a string and stores it in $xml,
while
$excel->download('FilmsClassified_from_'.str_replace(' ','',$_POST['fromdate']).'_to'.str_replace(' ','',$_POST['todate']));
creates the XML itself and directs it to output, with the appropriate headers for downloading.
So you're not using the class correctly; generating the XML a second time into an unused variable is unnecessary duplication of work.
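Following that advice, the tail of the export reduces to two calls (a sketch based on the description above):

$excel->create_worksheet('From_'.str_replace(' ','',$_POST['fromdate']).'_to'.str_replace(' ','',$_POST['todate']));
// download() builds and sends the XML itself, so the separate generate() call can go
$excel->download('FilmsClassified_from_'.str_replace(' ','',$_POST['fromdate']).'_to'.str_replace(' ','',$_POST['todate']));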
