Building an 'interface' for a .php upload API - PHP

I have a .php file from a video hoster who says it is command-line code that allows me to upload a video to their site. I need to create an interface so that a visitor can upload a video to the video hoster while staying on my website. I thought a simple HTML form like the one below would be enough, but apparently it doesn't work:
Front-end HTML:
<h2>This form allows you to upload a Video.</h2>
<form action="uploadapi.php" method="post" enctype="multipart/form-data"><br>
<p>Video Name: <input type="text" name="titel" size="50" /></p>
<p>Video Description:<br/><textarea name="text" rows="5" cols="50"> </textarea></p>
<p>Select File, allowed: .mpg <br /><input type="file" name="file"></p>
<p><input type="submit" value="Upload" /></p>
</form>
The uploadapi.php is supplied by the hoster, so I assume it is correct.
<?php
////////////////////////////////////////////////////////
// for php 5.6+ you need to make some changes in code
// method 1
// add the following line
// curl_setopt($ch, CURLOPT_SAFE_UPLOAD, 0);
//
// method 2
// change
// $post_fields['vfile'] = "#".$file;
// to
// $post_fields['vfile'] = CURLFile($file);
////////////////////////////////////////////////////////
$apiversion = "2.123.20150426";
//REQUIRED Registered Users - You can find your user token in API page.
$user_token = "xxx";
if (count($argv) < 2)
    die("Usage: php $argv[0] [VIDEO TO UPLOAD] {SUB FILE}\n");
$file = $argv[1];
if (!file_exists($file))
    die("ERROR: Can't find '$file'!\n");
$path_parts = pathinfo($file);
$ext = $path_parts['extension'];
$allowed = array("mov");
if (!in_array(strtolower($ext), $allowed))
    die("ERROR: Video format not permitted. Formats allowed: .mov!\n");
if (isset($argv[2]))
{
    $sub_file = $argv[2];
    if (!file_exists($sub_file))
        die("ERROR: Can't find '$sub_file'!\n");
    $path_parts = pathinfo($sub_file);
    $ext = $path_parts['extension'];
    $allowed = array("srt");
    if (!in_array(strtolower($ext), $allowed))
        die("ERROR: Subtitle format not permitted. Formats allowed: .srt!\n");
    $post_fields['subfile'] = "@".$sub_file;
}
$converter = file_get_contents("http://.../getconv_uploadapi.php?upload_hash=".$user_token);
if ($converter == "ERROR")
    die("ERROR: Could not choose converter. Aborting... \n");
$post_fields['vfile'] = new CURLFile($file);
$post_fields['upload'] = "1";
$post_fields['token'] = 'xxx';
if (!empty($user_token))
    $post_fields['upload_hash'] = $user_token;
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $converter);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, $post_fields);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$result = curl_exec($ch);
curl_close($ch);
echo "$result\n";
?>
Is my methodology correct (using an HTML form to call the uploadapi.php on my server), or do I need other programming languages (AJAX, JavaScript, etc.) in order to submit my video to the video hoster via uploadapi.php?

The provided file is a command-line script. That means it won't work on the web as expected.
For example, on the web you won't have $argv[1] or $argv[2].
You have two options:
you can rewrite uploadapi.php so that the source file is the file that came from your form (see the sketch below),
or upload the file to your server first and call uploadapi.php as a command-line script with arguments, using shell_exec or exec, for example.
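A minimal sketch of option 1 (a sketch only; the temp directory, the form field name "file" and the error messages are assumptions, and the rest of the hoster's script stays as it is):
<?php
// Replace the $argv handling with the uploaded file from the form.
if (!isset($_FILES['file']) || $_FILES['file']['error'] !== UPLOAD_ERR_OK)
    die("ERROR: No file uploaded!\n");
// CURLFile needs a real path, so store the upload somewhere first.
$file = sys_get_temp_dir() . '/' . basename($_FILES['file']['name']);
if (!move_uploaded_file($_FILES['file']['tmp_name'], $file))
    die("ERROR: Could not store the uploaded file!\n");
// From here on the extension check, converter lookup and cURL POST
// from the hoster's script can continue unchanged, using $file.
?>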

I thought that I could define or fill the variable $argv[1] from the HTML form, since type="file" name="file" = $file = $argv[1]?
... OK, rewriting the upload API is no alternative, since it comes from the hoster.
So what I need to do is upload uploadapi.php to my server and keep my HTML form as it is? And instead of action="uploadapi.php" I need an action="somethingsomething.php" which has an exec command to execute uploadapi.php, like this?:
<?php
function somethingsomething($uploadapi) {
    exec($uploadapi . " > /dev/null &");
}
?>
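Such a wrapper would have to store the upload first and then pass the stored path to uploadapi.php as its first argument. A rough sketch (the "php" binary name, the temp directory and the form field name "file" are assumptions):
<?php
if (!isset($_FILES['file']) || $_FILES['file']['error'] !== UPLOAD_ERR_OK)
    die("No file uploaded.");
$local = sys_get_temp_dir() . '/' . basename($_FILES['file']['name']);
if (!move_uploaded_file($_FILES['file']['tmp_name'], $local))
    die("Could not store the uploaded file.");
// Run the hoster's command-line script with the stored file as $argv[1]
// and show whatever it prints.
$output = shell_exec('php ' . escapeshellarg(__DIR__ . '/uploadapi.php') . ' ' . escapeshellarg($local));
echo $output;
?>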

Related

Multiple request in curl function

I am just transloading multiple files from one server to another. With a smaller number of URLs this script works perfectly, but when it gets into the thousands the number of simultaneous requests grows and the remote server blocks access to the files.
Here is the script:
<?php
// Check if form has been submitted
if (@$_POST['submit']) {
    ini_set("max_execution_time", 0); // no time-outs!
    ignore_user_abort(true); // Continue downloading even after user closes the browser.
    // URLs -- one on each line
    $URL = $_POST['url'];
    // Relative path to save downloaded images
    // Default is "downloads"
    // Make sure that it is writable (chmoded correctly)
    $folder = $_POST['folder'];
    // Check if user has provided a local folder
    if (!$folder || !isset($folder)) {
        // Generate error if left blank by user.
        die("Please specify local folder name");
    }
    // Split all URLs into an array (split() is removed in PHP 7, explode() does the same here)
    $urls = explode("\n", $URL);
    // Remove carriage returns (useful for Windows-based browsers)
    $urls = str_replace("\r", "", $urls);
    $mh = curl_multi_init();
    foreach ($urls as $i => $url) {
        $path = pathinfo($url);
        $g = $folder . "/" . $path["basename"];
        // Check if file already exists in the local folder.
        if (file_exists($g)) {
            // If it exists, delete the file so it always contains the latest update.
            unlink($g) or die("Unable to delete existing '$g'!");
        }
        // Update the user on what's going on
        echo "$i) Downloading: from <b>$url</b> to <b>$g</b><br />";
        if (!is_file($g)) {
            $conn[$i] = curl_init($url);
            $fp[$i] = fopen($g, "w");
            curl_setopt($conn[$i], CURLOPT_FILE, $fp[$i]);
            curl_setopt($conn[$i], CURLOPT_HEADER, 0);
            // curl_setopt($conn[$i], CURLOPT_CONNECTTIMEOUT, 1000);
            curl_multi_add_handle($mh, $conn[$i]);
        }
    }
    do {
        $n = curl_multi_exec($mh, $active);
    } while ($active);
    foreach ($urls as $i => $url) {
        curl_multi_remove_handle($mh, $conn[$i]);
        curl_close($conn[$i]);
        fclose($fp[$i]);
    }
    curl_multi_close($mh);
} // task closed
?>
<br />
<br />
<fieldset>
<legend>
<label for="url">Server to Server Upload Script</label>
</legend>
<form method=POST>
<label for="url">Insert Files URL, One Per Line: </label><br />
<textarea rows=15 cols=75 id="url" name="url"><?= $URL ?></textarea><br />
<label for="folder">Folder Name: </label><input type=text id="folder" name="folder" value="uploads"/>
<input type=submit name="submit" value="Start Uploading Files!" />
</form>
</fieldset>
With this script, if I add 5000 URLs it immediately starts sending requests and transloading all the files at once (the server blocks it because of the huge number of requests).
How do I change the code so that the next request is only sent once the first file has finished transloading?
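One way to do that is to drop curl_multi and download the files one at a time, so each transfer finishes before the next request goes out. A rough sketch, reusing the $urls and $folder variables from the script above:
<?php
foreach ($urls as $i => $url) {
    $path = pathinfo($url);
    $g = $folder . "/" . $path["basename"];
    echo "$i) Downloading: from <b>$url</b> to <b>$g</b><br />";
    $fp = fopen($g, "w");
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_FILE, $fp);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_exec($ch);   // blocks until this file has finished downloading
    curl_close($ch);
    fclose($fp);
    // sleep(1);      // optional pause so the remote server is not hammered
}
?>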

Move Upload File from PHP to CGI

I'm trying to pass POSTed file information to an upload.php file and have that information sent on to a CGI script. I can't find anything on the net about how to do this, and I've spent days looking. I know there are a few people out there that need this; it could help all of us who have legacy Perl scripts.
Dataflow:
Jquery --> Upload.php --> index.cgi
My PHP:
<?php
if (isset($_FILES['file'])) {
    if (move_uploaded_file($_FILES['file']['tmp_name'], "../index.cgi" . $_FILES['file']['name'])) {
        echo "success";
        exit;
    }
}
?>
Post call to CGI example:
foobar.com/index.cgi?act=store&data=$filename
Any suggestions would help greatly. Thank you.
From my understanding, your CGI script receives a parameter which is the path of the uploaded file. However, you are attempting to pass the uploaded file to your CGI script using a function that is only supposed to move a file from one place to another, without calling any script.
My suggestion is to do the following:
<?php
if (isset($_FILES['file'])) {
    $destination = "new/path/to/" . $_FILES['file']['name'];
    if (move_uploaded_file($_FILES['file']['tmp_name'], $destination)) {
        $data = array();
        // You can add multiple post parameters here
        // $data = array('param1' => 'value1', 'param2' => 'value2');
        $url = "http://url/to/hello.cgi";
        // You can POST a file by prefixing it with an @ (for <input type="file"> fields)
        $data['file'] = '@' . $destination;
        $handle = curl_init($url);
        curl_setopt($handle, CURLOPT_POST, true);
        curl_setopt($handle, CURLOPT_POSTFIELDS, $data);
        $result = curl_exec($handle);
        if ($result) {
            echo "success";
        }
        exit;
    }
}
?>
You can execute the CGI script via a cURL POST and pass any params you want.
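Note that the '@' prefix is deprecated as of PHP 5.5 and disabled by default from 5.6 on; on newer versions the same POST can be done with a CURLFile object. A sketch using the same $destination and $url as above:
<?php
$data = array();
$data['file'] = new CURLFile($destination);
$handle = curl_init($url);
curl_setopt($handle, CURLOPT_POST, true);
curl_setopt($handle, CURLOPT_POSTFIELDS, $data);
$result = curl_exec($handle);
?>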

How to download directly from one website to another

I have a piece of PHP code:
<?php
$image_cdn = "http://ddragon.leagueoflegends.com/cdn/4.2.6/img/champion/";
$championsJson = "http://ddragon.leagueoflegends.com/cdn/4.2.6/data/en_GB/champion.json";
$ch = curl_init();
$timeout = 0;
curl_setopt($ch, CURLOPT_URL, $championsJson);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
$json = curl_exec($ch);
curl_close($ch);
$json_array = json_decode($json, true);
$champions = $json_array["data"];
foreach ($champions as $championdata) {
    $image_url = $image_cdn . $championdata["image"]["full"];
    $image = file_get_contents($image_url);
    file_put_contents("imgfolder/" . $championdata["image"]["full"], $image);
}
?>
So the idea of the code is basically to decode a JSON file and to download the images from the following website:
http://gameinfo.na.leagueoflegends.com/en/game-info/champions/
The pictures download perfectly fine and are stored in a folder I have created on my hard drive. The next step is: is it possible for me to use this same piece of code to download/display the images on a website I have recently created:
www.lolguides4you.co.uk
The website is basically a day old and I have literally been messing around with some HTML coding. I am new to both HTML and PHP, so if someone could point me in the right direction, that would be great!
Assuming that all you want to do is insert the images into a page on your website, then this is quite simple.
However, it may be illegal for you to use an automated scraping tool to save/replicate/duplicate any of the images. Look into that before doing anything on the internet.
The first step is to upload your previous PHP script to your website. That way, rather than downloading the images to your computer and then trying to upload them to your site, it saves the files straight into a directory on the website.
Next, you can create a basic PHP page anywhere on your site:
<?php
echo "Images downloaded:\n";
?>
Next you can use PHP's scandir() function to find every file in the download directory:
<?php
echo "Images downloaded:\n";
$files = scandir('imgfolder/');
foreach ($files as $file) {
    // Do something
}
?>
Finally, you print each file name that was found:
<?php
echo "Images downloaded:\n";
$files = scandir('imgfolder/');
foreach ($files as $file) {
    echo $file . "\n";
}
?>
You could also use glob(), which lets you ignore any files that aren't of a certain type (in case you have other files in the same directory):
<?php
echo "Images downloaded:\n";
$files = glob("imgfolder/*.jpg");
foreach ($files as $file) {
    echo $file . "\n";
}
?>
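If you want the page to actually render the pictures rather than just print the file names, you can echo <img> tags instead. A sketch, assuming imgfolder/ is reachable from the page's URL and using the same glob pattern as above:
<?php
echo "Images downloaded:\n";
$files = glob("imgfolder/*.jpg");
foreach ($files as $file) {
    echo '<img src="' . htmlspecialchars($file) . '" alt="" /> ';
}
?>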

Checking whether a PDF file is present at a URL?

<?php
set_time_limit(0);
$url = 'http://www.some.url/file.pdf';
$path = 'files/file.pdf';
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$data = curl_exec($ch);
curl_close($ch);
file_put_contents($path, $data);
?>
This is the code that I use to download a particular PDF file from a given URL.
What if there are many files at that URL named file1.pdf, file2.pdf, etc.? How can I check, while running a loop, when to end the loop, since the files are only present up to a certain number?
Please help!
Checking for a 404 code:
$httpCode = curl_getinfo($handle, CURLINFO_HTTP_CODE);
if ($httpCode == 404) {
    /* file NOT found */
}
Checking the MIME type:
$mimeType = curl_getinfo($ch, CURLINFO_CONTENT_TYPE);
if ($mimeType == 'application/pdf') {
    /* It IS a pdf file */
}
But note that the MIME type can be something else and the file will still be a PDF! Also, check the MIME type of your own PDF files: echo it to see what you must look for. I'm not really sure the code in the if statement is right (it is only an example).
You can call curl_getinfo() right after curl_exec().
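Putting that together with a loop, a sketch that keeps requesting file1.pdf, file2.pdf, ... until the server answers 404 could look like this (the URL pattern and the files/ folder are assumed from the question):
<?php
set_time_limit(0);
$i = 1;
while (true) {
    $url = "http://www.some.url/file{$i}.pdf";
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $data = curl_exec($ch);
    $httpCode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    if ($httpCode == 404) {
        break;   // no such file -- stop the loop
    }
    file_put_contents("files/file{$i}.pdf", $data);
    $i++;
}
?>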
Pass the image link to file_get_contents(), then check the content using preg_match():
<?php
$link = $image->img1;
$filecontent = file_get_contents($link);
if (preg_match("/^%PDF-1.5/", $filecontent)) {
    echo "Valid pdf";
} else {
    echo "Invalid pdf";
}
?>
You can also check the last three characters of your file name:
<?php
$img_type = substr($image->img1, -3);
?>
<?php if (preg_match("/^%PDF-1.5/", $filecontent) || $img_type == 'pdf') { } ?>
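Note that /^%PDF-1.5/ only matches PDFs whose header says version 1.5. Since the version varies (%PDF-1.4, %PDF-1.7, ...), checking just the "%PDF-" prefix is a bit more robust, for example:
<?php
// "%PDF-" followed by any version number marks a PDF file.
if (strncmp($filecontent, "%PDF-", 5) === 0) {
    echo "Valid pdf";
} else {
    echo "Invalid pdf";
}
?>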

Creating download links for Amazon Kindle

I noticed that the Kindle recognizes file types by their extension. So, I wrote a small script on my website that downloads a file, adds the ".azw" extension and provides a download link. It works well sometimes, but...
The problem is that I'm not a PHP developer and I'm sure the script is not written in the best way. One problem is that the script doesn't download some files (exe) but loads a different exe instead. It gives an error saying that it cannot find the file, and creates a filename.exe.azw file with 0 length. This issue appears only on the Kindle; on the PC it's OK.
Also, it seems that it can only download files smaller than 9 MB.
The code is here:
<?php
if (!isset($_POST['link'])) {
?>
<form method="post" action="file.php">
    <p>
        Enter link: http://
        <input type="text" name="link" size="20"/>
        <input type="submit" name="submit" value="submit" />
    </p>
</form>
<?php
} else {
    $remote = 'http://' . $_POST['link'];
    $download = download($remote);
    echo 'Download: ' . $download . '';
}
function download($url) {
    $lastDashPos = strrpos($url, '/');
    $local = substr($url, $lastDashPos + 1, strlen($url) - $lastDashPos) . '.azw';
    $handle = file_get_contents($url);
    file_put_contents($local, $handle);
    return $local;
}
?>
The script is at http://forexsb.com/test/file.php if you want to test it.
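One thing worth checking for the size limit: file_get_contents() reads the whole file into memory, so PHP's memory_limit can make large downloads fail. A streamed copy keeps memory use small; this is only a sketch of the same download() idea, not a drop-in fix:
<?php
function download($url) {
    $lastDashPos = strrpos($url, '/');
    $local = substr($url, $lastDashPos + 1) . '.azw';
    // Copy the remote file in chunks instead of holding it all in memory.
    $in  = fopen($url, 'rb');
    $out = fopen($local, 'wb');
    if ($in && $out) {
        stream_copy_to_stream($in, $out);
    }
    if ($in)  { fclose($in); }
    if ($out) { fclose($out); }
    return $local;
}
?>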
