PHP: process image files created with complicated URL syntax

I have a script on my server that dynamically creates images of chess diagrams:
<img src = "ChessImager/ChessImager.php?fen=r3k2r/1pqb2pp/pnn2p2/4p3/4Q3/NBP1B3/P4PPP/R2R2K1&square_size=45&ds_color=(143,188,143)&ls_color=(232,223,192)">
But the resulting image files are nearly 30 KB, which is too big. I want to use pngnq (http://pngnq.sourceforge.net/) to shrink them. I present them in a slideshow at http://communitychessclub.com. I want a new PHP script that creates the images with ChessImager.php and writes each of these diagram image files (~50) to a new filename like 'game1234.png', which I'll batch pre-process (not in real time) with pngnq. I have a file 'Forsyth.csv' which lists the data:
r1bqk2r/1p2bp1p/p2pnp2/4pN1Q/2B1P3/2N5/PP3PPP/R2R2K1|1256
r3k2r/1pqb2pp/pnn2p2/4p3/4Q3/NBP1B3/P4PPP/R2R2K1|1255
4rrk1/ppp3pp/2n4q/3p4/3P4/1NP1PpPP/PP3Q1K/R4R2|1253
rn2kb1r/1q1p2p1/p3p3/1p2N1Bp/2p1P2P/2P4Q/PP3PP1/3R1RK1|1252
I use this:
<?php
$text = file('Forsyth.csv');
foreach ($text as $line) {
    $token = explode("|", $line);
    print "\n";
    $fen = $token[0];
    $game_num = $token[1];
    $phrase = "games/game$game_num.php";
    echo "<li><img src=\"ChessImager/ChessImager.php?fen=$fen&square_size=45&ds_color=(143,188,143)&ls_color=(232,223,192)\"></li>";
}
?>
Any ideas?
Update: this is posted at http://communitychessclub.com/produce.php
<?php
$text = file('Forsyth.csv');
foreach ($text as $line) {
    $token = explode("|", $line);
    print "\n";
    $fen = $token[0];
    $game_num = $token[1];
    print "\n";
    $goat = "diagrams/game$game_num.png";
    $src = "ChessImager/ChessImager.php?fen=$fen&square_size=45&ds_color=(143,188,143)&ls_color=(232,223,192)";
    echo "<li><img src=\"$src\"></li>";
}
?>
Any ideas?

ChessImager.php or one of its includes must have an imagepng($image) line near the end that sends the generated PNG image to the web browser. If your question is how to save that data to disk instead, you can just modify the script so that it writes the image data to a file:
imagepng($image, $filename);
where $filename is something unique that you can generate from the arguments passed to the script. For example:
$filename = md5($fen).".png";
Wherever you decide to have the script save the files, you'll need to make sure that you (or the web server, if you're running it in a browser) have permission to write to that folder.
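Alternatively, you can leave ChessImager.php untouched and batch-fetch its output from a separate script, which matches the Forsyth.csv loop you already have. A minimal sketch, assuming the script runs with allow_url_fopen enabled, the diagrams/ directory exists and is writable, and the base URL is the one from your page:
<?php
// Hedged sketch: render each FEN in Forsyth.csv to diagrams/game<NUM>.png by
// requesting ChessImager.php over HTTP and saving the returned PNG bytes.
$base = 'http://communitychessclub.com/ChessImager/ChessImager.php';
$lines = file('Forsyth.csv', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
foreach ($lines as $line) {
    list($fen, $game_num) = explode('|', trim($line));
    $query = http_build_query(array(
        'fen'         => $fen,
        'square_size' => 45,
        'ds_color'    => '(143,188,143)',
        'ls_color'    => '(232,223,192)',
    ));
    $png = file_get_contents($base . '?' . $query);
    if ($png !== false) {
        file_put_contents("diagrams/game{$game_num}.png", $png);
    }
}
?>
The resulting diagrams/*.png files can then be batch-processed with pngnq before you publish them.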

Related

how to add the logo into csv file using ECSVExport in yii?

Yii::import('application.extensions..ECSVExport');
$filename = 'filename.csv';
$csv = new ECSVExport($sheet_generation);
$csv->setOutputFile($outputFile);
$imageUrl = Yii::app()->request->baseUrl.'/themes/optisol/images/resign-icon.png';
$num = cal_days_in_month(CAL_GREGORIAN, $month,$year);
$heading="Attendance for 01"."-".$month."-".$year." To ".$num."-".$month."-".$year." ";
$content=$heading;
$content = $content.$csv->toCSV();
Yii::app()->getRequest()->sendFile($filename, $content, "text/csv", false);
It cannot be done. Not in the way you asked, actually.
A CSV is a text file, and therefore cannot have embedded images like a word processor document can.
If you want the image associated with each row, you will have to put the file name in the document. Then, the person reading the document can decide what to do with that file name.
Such a file will look like this:
id, name, image, email
1,'Tom', 'image1.png', 'user1@domain.com'
2,'Jones', 'image2.png', 'user2@domain.com'
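A minimal sketch of producing such a file with fputcsv() (the output file name, columns, and sample rows are illustrative, not from ECSVExport):
<?php
// Hedged sketch: write a CSV that references each image by file name instead of
// trying to embed the image itself.
$rows = array(
    array(1, 'Tom',   'image1.png', 'user1@domain.com'),
    array(2, 'Jones', 'image2.png', 'user2@domain.com'),
);
$fh = fopen('export.csv', 'w');
fputcsv($fh, array('id', 'name', 'image', 'email'));   // header row
foreach ($rows as $row) {
    fputcsv($fh, $row);
}
fclose($fh);
?>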

Copy Multiple Images From Remote Server

I am trying to copy multiple images from a remote host (using a URL) to my local box (using XAMPP on my local box to execute the script).
I am using copy(). When I go to execute the copy(), only the LAST image in the array is created. So, if I have 5 image links, only the 5th image gets created and nothing prior even gets a file created.
I have tried cURL and fopen, and both create all of the files, but all of them are blank except, again, the last file, which is perfectly fine.
$txt_file = file_get_contents('urls_for_images.txt');
if (!empty($txt_file)) {
    $image_links = explode("\n", $txt_file);
    $i = 1;
    foreach ($image_links as $image_link) {
        $file_info = pathinfo($image_link);
        copy($image_link, 'images/00' . $i . '_original.' . $file_info['extension']);
        $i++;
    }
}
I am not sure where the problem is occurring, but it seems odd to me that it will copy the last image in the text file, but not any of the others.
Thanks for the help in advance!
The $i variable never changes, therefore the code tries to copy to a file with the same name over and over again, and only the last file is saved.
Try modifying your code this way:
$txt_file = file_get_contents('urls_for_images.txt');
if (!empty($txt_file)) {
    $image_links = explode("\n", $txt_file);
    $i = 1;
    foreach ($image_links as $image_link) {
        $file_info = pathinfo($image_link);
        copy($image_link, 'images/00' . $i . '_original.' . $file_info['extension']);
        $i++;
    }
}
You'd be better off with just file(), which reads the file into an array automatically:
$files = file('urls_for_images.txt', FILE_IGNORE_NEW_LINES);
foreach ($files as $remote_file) {
    // pick whatever local naming scheme you want; reusing the remote file name is one option
    $local_file = 'images/' . basename($remote_file);
    copy($remote_file, $local_file);
}
It appears the problem is that the text file's line endings are \r\n and you are exploding on only \n. The quickest fix is either to explode on \r\n, or to trim() each line with the default parameters so the trailing \r is removed:
foreach ($image_links as $image_link) {
    $image_link = trim($image_link);
    $file_info = pathinfo($image_link);
    ...
}
However, the cleanest way to do this is to use the file() function, which handles line endings automatically. I recommend that approach.
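Putting those two suggestions together, a minimal sketch (the images/ target directory comes from the question's code; the zero-padded naming is an illustrative variant of it):
<?php
// Hedged sketch: read the URL list with file(), trim any stray \r left over from
// CRLF line endings, and copy each image into the images/ folder.
$image_links = file('urls_for_images.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$i = 1;
foreach ($image_links as $image_link) {
    $image_link = trim($image_link);                          // drop a leftover \r, if any
    $extension  = pathinfo($image_link, PATHINFO_EXTENSION);
    copy($image_link, sprintf('images/%03d_original.%s', $i, $extension));
    $i++;
}
?>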

php scraper scripts need to be changed

This script harvests links from a seed URL and only prints them to the command shell (or browser) rather than saving them anywhere. I want the script to store its output in a .txt file in the folder where the script resides. I need suggestions on an efficient way to do that. Please give me hints.
<?php
# Initialization
include("LIB_http.php");              // http library
include("LIB_parse.php");             // parse library
include("LIB_resolve_addresses.php"); // address resolution library
include("LIB_exclusion_list.php");    // list of excluded keywords
include("LIB_simple_spider.php");     // spider routines used by this app.
set_time_limit(3600);                 // Don't let PHP timeout

$SEED_URL = "http://www.schrenk.com"; // First URL spider downloads
$MAX_PENETRATION = 1;                 // Set spider penetration depth
$FETCH_DELAY = 1;                     // Wait one second between page fetches
$ALLOW_OFFISTE = false;               // Don't allow spider to roam from the SEED_URL's domain
$spider_array = array();

# Get links from $SEED_URL
echo "Harvesting Seed URL \n";
$temp_link_array = harvest_links($SEED_URL);
$spider_array = archive_links($spider_array, 0, $temp_link_array);

# Spider links in remaining penetration levels
for ($penetration_level = 1; $penetration_level <= $MAX_PENETRATION; $penetration_level++)
{
    $previous_level = $penetration_level - 1;
    for ($xx = 0; $xx < count($spider_array[$previous_level]); $xx++)
    {
        unset($temp_link_array);
        $temp_link_array = harvest_links($spider_array[$previous_level][$xx]);
        echo "Level=$penetration_level, xx=$xx of ".count($spider_array[$previous_level])." <br>\n";
        $spider_array = archive_links($spider_array, $penetration_level, $temp_link_array);
    }
}
?>
Use the file_put_contents PHP function with the FILE_APPEND flag enabled.
$file = 'file_name.txt';
file_put_contents($file, $text_to_write_to_file, FILE_APPEND);
Ref: http://www.php.net/manual/en/function.file-put-contents.php
I would recommend first creating a variable to store the output in the script. So at the top (under the $spider_array = array() line) add:
$output = "";
Then change every line that uses echo to append to $output with .= instead.
This will collect everything that was being sent to the screen or the browser into the $output variable.
Now at the bottom of the script, after everything has been scraped and the spider is finished, save the output to a file:
$filename = date('Y_m_d_H_i_s') . '.txt';
$filepath = dirname(__FILE__);
file_put_contents($filepath . '/' . $filename, $output);
This should save the output in a file within the same folder as the script, with a date/time file name. (This code was written using examples from php.net; the exact implementation may need a bit of debugging, but it should get you close enough.)
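Applied to the spider script above, the change looks roughly like this (a sketch of this answer's approach, with the spidering loops elided):
<?php
// ...same includes and settings as in the script above...
$spider_array = array();
$output = "";                                  // collect everything that used to be echoed

// instead of:  echo "Harvesting Seed URL \n";
$output .= "Harvesting Seed URL \n";

// ...the two spidering loops, with each echo replaced by $output .= ...

// at the very end, write the collected output next to the script
$filename = date('Y_m_d_H_i_s') . '.txt';
file_put_contents(dirname(__FILE__) . '/' . $filename, $output);
?>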

Download images from specific array of urls with php

I need to download some images from a specific array of urls, something like:
<?php
$images = array('http://url1.com/img1.png','http://url1.com/img2.png');
// download these images from these paths
I saw some scripts here but don't quite understand how to connect them to this array.
The expected result: when I run the script, it should download those images from that array into a specific folder on my server: home/user/public_html/images. Any help is much appreciated; I'm trying but can't make the connection, currently a rookie.
Something like this maybe.
$images = array('http://ecx.images-amazon.com/images/I/214RgVjsvTL.jpg','http://ecx.images-amazon.com/images/I/515pMJlul8L.jpg');
foreach ($images as $image) {
    // get the raw image data; $image is a URL from your array
    $imageData = file_get_contents($image);
    // use the last segment of the URL as the local file name
    // (the original indexed the explode()'d parts; basename() is less fragile)
    $name = basename($image);
    // write it into the images/ folder, overwriting any existing copy
    $handle = fopen("images/" . $name, "w");
    fwrite($handle, $imageData);
    fclose($handle);
}
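If the target folder from the question is on the same box, a shorter variant with copy() works too (the absolute path below is the one from the question and is an assumption about your layout):
<?php
$images = array('http://url1.com/img1.png', 'http://url1.com/img2.png');
$target_dir = '/home/user/public_html/images';   // adjust to your actual path
foreach ($images as $image) {
    // copy() accepts a URL as the source when allow_url_fopen is enabled
    copy($image, $target_dir . '/' . basename($image));
}
?>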

Editing PHP using PHP (admin center)

I am developing an admin center where I can edit configuration files (written in PHP). I do NOT want to store these values in a MySQL table (for various reasons). So say my config.php has contents like:
<?php
$option1 = 1;
$option2 = 2;
$option4 = 5;
$option7 = array('test','a','b','c');
?>
Now say that in one of the admin pages I will only be changing a few values like option2 or option4. Any ideas on the best way to go about this?
I know one option is to read the PHP file completely and rewrite parts of it using regex. Any way to make this more efficient? I don't want the config.php file to break because of some error on the user's end. Any ideas on how to ensure that it keeps working?
If you have some liberty in how you store configuration values, you may use ini files.
All you have to do is load the content of the ini file in an array with parse_ini_file, then modify values in that array and finally overwrite the file with new values, as described in this comment.
For obvious security reasons it's a good idea to place those files out of your document root.
Sample content of an ini file:
[first_section]
one = 1
five = 5
animal = BIRD
[second_section]
path = "/usr/local/bin"
URL = "http://www.example.com/~username"
Sample code (using the safefilerewrite function described in that comment):
<?php
$ini_file = '/path/to/file.ini';
$ini_array = parse_ini_file($ini_file);   // note: this flattens the [sections]
$ini_array['animal'] = 'CAT';
// rebuild "key = value" lines before writing them back (implode() alone would drop the keys)
$lines = array();
foreach ($ini_array as $key => $value) {
    $lines[] = $key . ' = ' . $value;
}
safefilerewrite($ini_file, implode("\r\n", $lines));
?>
var_export() is probably the function you're looking for.
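A minimal sketch of that approach (the return-an-array config layout and file name are illustrative, not from the question):
<?php
// Hedged sketch: dump the whole settings array back out as valid PHP with var_export().
$config = array(
    'option1' => 1,
    'option2' => 2,
    'option4' => 5,
    'option7' => array('test', 'a', 'b', 'c'),
);
file_put_contents('config.php', "<?php\nreturn " . var_export($config, true) . ";\n");

// reading it back is then just:
$config = include 'config.php';
?>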
You can write the settings to a file using the following code:
$content = array();
// fill your array with settings, then turn it into a string: fwrite() cannot take an array
$fh = fopen($settings_file, 'w') or die("can't open file");   // $settings_file: path to write to
fwrite($fh, implode("\n", $content));
fclose($fh);
To read it back you can use:
file_get_contents() // this returns the whole file as a single string
Or read it line by line:
$lines = file('file.txt');
//loop through our array, show HTML source as HTML source; and line numbers too.
foreach ($lines as $line_num => $line) {
    print "Line #<b>{$line_num}</b> : " . htmlspecialchars($line) . "<br />\n";
}
