SimpleXMLElement - string could not be parsed as XML - php

I have a problem loading an external XML file.
When I open the URL in a browser, everything looks fine. As a test, I downloaded the XML file and uploaded it to my own server; loading it from there works without problems.
Can somebody help me solve this, so that I can load the file directly from the external server?
My code:
$oXML_cz = new SimpleXMLElement(file_get_contents('http://www.ticketportal.cz/xml/temp/partnerall1.xml?ID_partner=122'));
foreach ($oXML_cz->event as $event_cz)
{
......
}

This library's errors are not well documented. The problem MAY be caused by an excessively large XML file rather than by the structural compliance or integrity of the XML itself.
For example, a 15 MB file may expand to around 1 GB once parsed, so even ini_set('memory_limit', '1024M'); may not be enough.
In that situation, I solved the problem by passing the LIBXML_PARSEHUGE option when creating/loading the XML:
$xml = new SimpleXMLElement($contents, LIBXML_PARSEHUGE);
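When the exception message itself is uninformative, libxml can be asked to report the underlying parse errors. A minimal diagnostic sketch using standard libxml functions (the URL is the one from the question):

libxml_use_internal_errors(true); // collect errors instead of emitting warnings

$contents = file_get_contents('http://www.ticketportal.cz/xml/temp/partnerall1.xml?ID_partner=122');
$xml = simplexml_load_string($contents);

if ($xml === false) {
    foreach (libxml_get_errors() as $error) {
        // Each error carries the line, column and message of the problem
        printf("Line %d, col %d: %s\n", $error->line, $error->column, trim($error->message));
    }
    libxml_clear_errors();
}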

One solution is to fetch the file with cURL instead of file_get_contents():
function download_page($path) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $path);
    curl_setopt($ch, CURLOPT_FAILONERROR, 1);    // treat HTTP errors (>= 400) as failure
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1); // follow redirects
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // return the body instead of printing it
    curl_setopt($ch, CURLOPT_TIMEOUT, 15);
    $retValue = curl_exec($ch);
    curl_close($ch);
    return $retValue;
}
$sXML = download_page('http://www.domain.com/file.xml');
$oXML_cz = new SimpleXMLElement($sXML);
foreach($oXML_cz->event as $event_cz)
{
...
}
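Note that download_page() returns false when cURL fails (because CURLOPT_FAILONERROR is set), so it is worth guarding $sXML before handing it to SimpleXMLElement. A small, assumed addition:

if ($sXML === false) {
    die('Download failed'); // hypothetical handling; adapt as needed
}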

Check the configuration of allow_url_fopen; if it is disabled, file_get_contents() cannot open remote URLs at all. More tips can be found in this question.
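You can verify the setting at runtime with the standard ini_get() function:

// '1' means file_get_contents() and friends may open remote URLs
var_dump(ini_get('allow_url_fopen'));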

Related

Playing .m3u8 video using php curl

I'm attempting to play a .m3u8 video through a PHP cURL proxy. The following code runs, but it only returns the playlist's #EXTM3U text and does not play the video.
Code:
<?php
// ....proxy info
$auth = 'username:password';
$proxy_ip = '1.2.3.4.5';
$proxy_port = 8080;
$path = $_GET['link'];

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $path);
//curl_setopt($ch, CURLOPT_HEADER, true);
curl_setopt($ch, CURLOPT_PROXYPORT, $proxy_port);
curl_setopt($ch, CURLOPT_PROXYTYPE, CURLPROXY_HTTP); // expects the CURLPROXY_* constant, not the string 'HTTP'
curl_setopt($ch, CURLOPT_PROXY, $proxy_ip);
curl_setopt($ch, CURLOPT_PROXYUSERPWD, $auth);
curl_exec($ch); // no CURLOPT_RETURNTRANSFER, so the response body is echoed directly
if (curl_error($ch)) {
    echo curl_error($ch); // report any cURL error once
}
curl_close($ch);
?>
Output:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-STREAM-INF:BANDWIDTH=5640800,AVERAGE-BANDWIDTH=5640800,CODECS="avc1.4d4028,mp4a.40.2",RESOLUTION=1920x1080,FRAME-RATE=25.000
index4147.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2421100,AVERAGE-BANDWIDTH=2421100,CODECS="avc1.4d401f,mp4a.40.2",RESOLUTION=1280x720,FRAME-RATE=25.000
index2073.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1566400,AVERAGE-BANDWIDTH=1566400,CODECS="avc1.4d401f,mp4a.40.2",RESOLUTION=960x540,FRAME-RATE=25.000
index1296.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1002100,AVERAGE-BANDWIDTH=1002100,CODECS="avc1.77.30,mp4a.40.2",RESOLUTION=746x420,FRAME-RATE=25.000
index783.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=774400,AVERAGE-BANDWIDTH=774400,CODECS="avc1.77.30,mp4a.40.2",RESOLUTION=640x360,FRAME-RATE=25.000
index576.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=421300,AVERAGE-BANDWIDTH=421300,CODECS="avc1.42c015,mp4a.40.2",RESOLUTION=426x240,FRAME-RATE=25.000
index255.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=476300,AVERAGE-BANDWIDTH=476300,CODECS="avc1.42c01f,mp4a.40.2",RESOLUTION=640x360,FRAME-RATE=25.000
index101.m3u8
Any ideas regarding how I can play the video?
Facing the same issue...
Had any luck figuring it out?
I was also facing the same issue, until just a few moments ago.
I was able to generate an m3u8 file in PHP and set the MIME type correctly, but video.js and VLC just wouldn't play it.
If I downloaded the output, the file played fine in VLC or video.js.
I followed a tip from Markus AO on How to create dynamic m3u8 using php and it worked.
Basically, the players seem to ignore the MIME type (although I haven't tried removing it) and just look at the extension.
So I used .htaccess to make it so that when certain .m3u8 files are requested, it actually calls my PHP file.
Added this to my .htaccess:
RewriteRule ^proxy([^.]+)\.m3u8$ proxy.php
I'm actually sending the address as a GET parameter, just like your $_GET['link']; a sketch of such a proxy.php follows below.
The wildcard between proxy and the .m3u8 extension is there because I noticed that a plugin I installed in Chrome would ignore a changed query parameter and keep showing the same stream, as long as the file was called proxy.m3u8.
Hope this is helpful to anybody.
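For illustration, here is a minimal sketch of what such a proxy.php could look like. It assumes the target URL arrives in $_GET['link'] as above; the validation and header choice are my assumptions, not taken from the original answer:

<?php
// proxy.php - served whenever a proxy*.m3u8 URL is requested (see RewriteRule above)
$link = isset($_GET['link']) ? $_GET['link'] : '';
if (!filter_var($link, FILTER_VALIDATE_URL)) {
    http_response_code(400);
    exit('Missing or invalid link parameter');
}

$ch = curl_init($link);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
$playlist = curl_exec($ch);
curl_close($ch);

if ($playlist === false) {
    http_response_code(502);
    exit('Upstream fetch failed');
}

// Common MIME type for HLS playlists; per the answer above, players may key off the extension anyway
header('Content-Type: application/vnd.apple.mpegurl');
echo $playlist;

Note that a master playlist references its variant playlists (index4147.m3u8 etc.) by relative URL, so a complete proxy would also have to rewrite those lines to route back through the proxy; that is likely why the original code only ever showed the #EXTM3U text.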

Check if a url offers a downloadable file?

I am currently working on some affiliate feeds, most of which are offered in raw .csv format. I am using file_get_contents() to read them and fputcsv() to generate .csv files.
Unfortunately, among my affiliate URLs there is also a link that instantly downloads a csv file when you visit it in the browser. This one needs no further work, since it is a perfect .csv file as is.
Since I just put my URLs in an array, I need to detect when a URL is offered as a download link. How can I check for this, so I can skip all my default .csv logic and not mess this file up?
I don't know what to search for, since I don't know what exactly happens when a file downloads straight away instead of showing raw csv data. Hopefully somebody can help me out.
You can check if a file is downloadable using cURL:
function checkDownloadable($url) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_NOBODY, 1);      // headers only, no body
    curl_setopt($ch, CURLOPT_FAILONERROR, 1); // HTTP errors (>= 400) make curl_exec() return false
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    $result = curl_exec($ch);
    curl_close($ch); // free the handle
    return $result !== false;
}
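The check above only tells you the URL is reachable. What actually makes a browser download a file instead of rendering it is usually a Content-Disposition: attachment response header, so a sketch like this (the helper name is my own) could distinguish the two cases:

function isForcedDownload($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);         // headers only
    curl_setopt($ch, CURLOPT_HEADER, true);         // include headers in the returned string
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    $headers = curl_exec($ch);
    curl_close($ch);
    if ($headers === false) {
        return false;
    }
    // "Content-Disposition: attachment" is the usual signal for a forced download
    return stripos($headers, 'content-disposition: attachment') !== false;
}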

How to read JSON file in PHP without using file_get_contents?

I've created a simple PHP file that reads a local JSON file and uses the data to respond to queries sent to the page.
I'm using file_get_contents() to read the file, but as the request count has grown, I'm facing performance issues because of concurrent access to the file.
Is it possible to use require() or include() to read and parse the JSON file?
I've found a solution that fits: I'm now using cURL to get the file contents like this:
$curlSession = curl_init();
curl_setopt($curlSession, CURLOPT_URL, 'http://mysite/file.json');
curl_setopt($curlSession, CURLOPT_BINARYTRANSFER, true);
curl_setopt($curlSession, CURLOPT_RETURNTRANSFER, true);
$jsonDataString = curl_exec($curlSession);
curl_close($curlSession);
Yes, you can use require():
ob_start();
ob_implicit_flush(false);
require('test.json');
$json = json_decode(ob_get_clean(), true);
but I think it's not faster than file_get_contents().
For raw performance, the best option may be to serve the JSON file to the browser directly (e.g. fetched with jQuery), without any PHP parsing at all.
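If the data must stay server-side, another option (my suggestion, assuming the APCu extension is installed) is to cache the decoded array in shared memory, so concurrent requests stop hitting the disk:

$data = apcu_fetch('json_cache', $hit);
if (!$hit) {
    // Only a cold cache pays the disk-read cost; subsequent requests read from memory
    $data = json_decode(file_get_contents(__DIR__ . '/file.json'), true);
    apcu_store('json_cache', $data, 60); // cache for 60 seconds
}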

File download PHP script doesn't work because of server delay

I have this code to download a file, but on the sourceforge.net server there is a 5-second delay before the file starts to download (you can see it if you open this link in a browser). After the script runs, I end up with a file of zero size. How can I download this file? Thanks in advance!
$url = 'http://downloads.sourceforge.net/project/gnucash/gnucash%20%28stable%29/2.4.9/gnucash-2.4.9-setup.exe';
$ch = curl_init($url);
$fp = fopen('/home/content/11/8564211/html/'.substr($url,strrpos($url,'/'),strlen($url)), 'w');
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);
fclose($fp);
SourceForge uses a meta refresh tag to start the download, and because CURLOPT_FOLLOWLOCATION only reacts to the Location: header, it will most likely not help.
I think you're going to have to do some HTML parsing to achieve what you want. You have to find this line:
<meta http-equiv="refresh" content="5; url=http://downloads.sourceforge.net/project/gnucash/gnucash%20%28stable%29/2.4.9/gnucash-2.4.9-setup.exe?r=&ts=1333621946&use_mirror=switch">
Then you must extract the URL from that line and load it.
It's possible that SourceForge uses some cookie- or session-based blocker for this kind of download, so you may have to compensate for that.
I haven't tested this, but it looks close to the way you'd have to do it.
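A rough sketch of that approach (untested, as the answer says; the regex targets the exact tag shown above, the destination path reuses the one from the question, and the cookie handling just mentioned is not included):

$pageUrl = 'http://downloads.sourceforge.net/project/gnucash/gnucash%20%28stable%29/2.4.9/gnucash-2.4.9-setup.exe';
$page = file_get_contents($pageUrl); // the interstitial page containing the meta refresh

if (preg_match('/<meta\s+http-equiv="refresh"\s+content="\d+;\s*url=([^"]+)"/i', $page, $m)) {
    $realUrl = html_entity_decode($m[1]); // the URL may contain &amp; entities

    $ch = curl_init($realUrl);
    $fp = fopen('/home/content/11/8564211/html/' . basename(parse_url($realUrl, PHP_URL_PATH)), 'w');
    curl_setopt($ch, CURLOPT_FILE, $fp);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
    curl_exec($ch);
    curl_close($ch);
    fclose($fp);
}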
You can try this:
$url = 'http://downloads.sourceforge.net/project/gnucash/gnucash%20%28stable%29/2.4.9/gnucash-2.4.9-setup.exe';
$ch = curl_init($url);
$fp = fopen('/home/content/11/8564211/html/' . basename($url), 'w'); // basename() extracts the file name
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1); // follow Location: redirects
curl_exec($ch);
curl_close($ch);
fclose($fp);
To avoid timeouts in PHP you can call:
set_time_limit($hugetimeout);
at the start of your script. You can read further documentation here.
On the download page there's a direct link; you could try using that instead?

PHP cURL sending and receive Images Client / Server

I have been researching this for a while and have not been able to find an answer.
I have a client site making calls to our API server. What I would like is to transfer an image to the client site when a special call is made.
I have some code that downloads the image from the server, but this forces us to make multiple calls and to create images on the server that we don't want to keep, even if we delete them afterwards.
$originalFileUrl = createImage('createImage', $fileName);
downloadImage($originalFileUrl, $fileDestination);
deleteFileFromServer('deleteImage', $fileName);
function serverCall ($action, $fileName) {
    global $SERVER; // base URL of the API server, defined elsewhere
    $serverCall = $SERVER . '/api.php?fileName=' . $fileName . '&action=' . $action;

    ob_start();
    $ch = curl_init();
    $timeout = 5;
    curl_setopt($ch, CURLOPT_URL, $serverCall);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 0); // with RETURNTRANSFER off, the response lands in the output buffer
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
    curl_exec($ch);
    $fileContents = ob_get_contents();
    curl_close($ch);
    ob_end_clean();
    return $fileContents;
}
function downloadImage ($originalFileUrl, $fileDestination) {
    // Starting output buffering
    ob_start();
    // create a new cURL resource
    $ch = curl_init();
    // set URL and other appropriate options
    curl_setopt($ch, CURLOPT_URL, $originalFileUrl);
    curl_setopt($ch, CURLOPT_HEADER, false);
    curl_setopt($ch, CURLOPT_BINARYTRANSFER, true);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    // set timeouts
    set_time_limit(30);                    // set time in seconds for PHP
    curl_setopt($ch, CURLOPT_TIMEOUT, 30); // and also for cURL
    // open a stream for writing
    $outFile = fopen($fileDestination, 'wb');
    curl_setopt($ch, CURLOPT_FILE, $outFile);
    // grab file from URL
    curl_exec($ch);
    fclose($outFile);
    // close cURL resource, and free up system resources
    curl_close($ch);
    ob_end_clean();
}
Where $originalFileUrl is the current location of the file, and $fileDestination is the path to where I want my new file to be.
My question is: can I make a single call to a PHP file on the server that takes care of creating, transferring, and deleting the image all in one request, rather than making multiple calls?
Also, for multiple reasons, FTPing the file from the server to the client is not a good option.
Thank you
This will not be a trivial task, but you should be able to design a successful approach. It won't be the most error-safe method of accomplishing the task, though. You're thinking of an HTTP-esque stateless protocol, which is manageable. If the description below doesn't sound good enough, consider another protocol that can maintain a constant bi-directional connection (like an SSH tunnel).
You'd likely suffer some data overhead, but that is generally acceptable if it saves multiple calls. To that end, I'd advise creating an XML interface. On the receiving end, your XML would have an element containing either a Base64 representation of the image, or possibly a gzipped CDATA implementation. You don't have to stick to any XML standard, but if you do, the PHP XML Parser could help with some of the legwork.
So, to recap: in this model the server receives a set of commands that do what you've called out: move the file into a processing folder, create a Base64 string of the file contents, craft the XML package, and return it. The client sends a request and processes the response. If the client detects an error, it can retry, and the server can still grab the file data from the processing queue. A sketch of this idea follows below.
If errors become an issue and an open socket isn't a good option (because the coding is difficult), you could also develop a delete-batching system, where you track the files in the processing folder and only delete them on request. You'd only make delete requests from the client every once in a while, possibly not as part of any page with a user experience but from a cron job.
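For illustration only, here is a minimal sketch of both ends of that XML exchange. Every name in it (the processing folder, the element names, the 'fetchImage' action) is a hypothetical placeholder, not something from the answer above; the client side reuses the question's own serverCall() helper:

// --- server side (api.php): create, encode and clean up in a single request ---
$fileName = basename($_GET['fileName']); // basename() guards against path traversal
$path = '/tmp/processing/' . $fileName;  // hypothetical processing folder

$xml = new SimpleXMLElement('<response/>');
$xml->addChild('name', $fileName);
$xml->addChild('data', base64_encode(file_get_contents($path))); // image bytes as XML-safe text

header('Content-Type: text/xml');
echo $xml->asXML();
unlink($path); // the file is deleted only after its contents are in the payload

// --- client side: one call fetches, decodes and stores the image ---
$response = new SimpleXMLElement(serverCall('fetchImage', $fileName));
file_put_contents($fileDestination, base64_decode((string) $response->data));

This keeps the create/transfer/delete cycle inside one round trip; the delete-batching variant described above would only change when unlink() runs.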
