I have a URL provided by a wholesaler. The URL generates an XML file which I need to save on my server.
I use PHP's file_get_contents() and file_put_contents() to do that:
$savepath = "path_to_my_server_folder";
$xmlurl = "http://usistema.eurodigital.lt/newxml/xmlfile.aspx?xml=labas&code=052048048048051057049050048049052";
file_put_contents($savepath.'eurodigital.xml', file_get_contents($xmlurl));
The file is generated on my server, but its content is empty. I have no problems with other XML files when I provide a direct XML URL, but in this situation the file is generated dynamically by an .aspx script. The XML URL above is the actual URL I use; when I open it in a browser, the XML file gets saved to my device.
Can you help me with moving the XML to my server? What function should I use? Is it even possible to do this using PHP 5? "allow_url_fopen" is ON.
Thank you in advance!
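A sketch of one possible fix (an assumption, not something the question confirms: dynamic .aspx endpoints sometimes return an empty body to clients that don't send browser-like headers) is to pass a User-Agent through a stream context:
$savepath = "path_to_my_server_folder";
$xmlurl = "http://usistema.eurodigital.lt/newxml/xmlfile.aspx?xml=labas&code=052048048048051057049050048049052";
// Send a browser-like User-Agent via a stream context
// (plain array() syntax keeps this PHP 5 compatible).
$context = stream_context_create(array(
    'http' => array(
        'header' => "User-Agent: Mozilla/5.0\r\n",
    ),
));
$xml = file_get_contents($xmlurl, false, $context);
// Only write the file when the fetch actually returned a body.
if ($xml !== false && $xml !== '') {
    file_put_contents($savepath . 'eurodigital.xml', $xml);
}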
I am in a situation where I need to download files from a URL. It is easy with direct file URLs like https://somedomain.com/some-path/somefile.exe:
file_put_contents($save_file_loc, file_get_contents($url_to_download));
But what do you do when the URL prints HTML and triggers a delayed, forced download, and how do you differentiate such URLs?
Example URL: https://filehippo.com/download_mozilla-firefox-64/post_download/
EDIT: On the above URL the file download is started by JavaScript; I tested this by blocking JS, and the download did not start.
Thanks in advance for your help.
1. Read the HTML of the URL using file_get_contents().
2. Find the URL of the file within the HTML. You'll have to visit the page and view its source to locate the URL. In your example of https://filehippo.com/download_mozilla-firefox-64/post_download/ it's found in data-qa-download-url="https://dl5.filehippo.com/367/fb9/ef3863463463b174ae36c8bf09a90145/Firefox_Installer.exe?Expires=1594425587&Signature=18ab87cedcf3464363469231db54575665668c4f6&url=https://filehippo.com/download_mozilla-firefox-64/&Filename=Firefox_Installer.exe". Note that the page may have pre-approved the request, so this is not guaranteed to work if the host runs checks using cookies or other methods.
3. Create a regex based on the above to extract the URL using preg_match().
4. file_get_contents() the URL of the file to download it, as in the sketch below.
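Putting those steps together, a minimal sketch (assuming the data-qa-download-url attribute is still present in the page markup):
$pageUrl = 'https://filehippo.com/download_mozilla-firefox-64/post_download/';
$html = file_get_contents($pageUrl);
// Pull the direct file URL out of the attribute, then fetch the file itself.
if ($html !== false && preg_match('/data-qa-download-url="([^"]+)"/', $html, $matches)) {
    $fileUrl = html_entity_decode($matches[1]); // the attribute may contain &amp; entities
    file_put_contents('Firefox_Installer.exe', file_get_contents($fileUrl));
}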
I am using a PHP script to generate an FDF response, and in that response the PDF template is supplied as a remote file: "https://platweb-eur.qa.reachlocal.com/test/546_df50ab50418e1e925e81020815217649.pdf"
This opens the PDF template properly, but the fields are not populated.
When I use a local PDF template instead of the file URL for the same PDF, the field values are populated correctly.
So the issue is with the remote file. Can anybody point out what is going wrong here?
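For reference, the /F entry in the FDF is what points the viewer at the template. A minimal sketch of an FDF response referencing the remote PDF (the field name "name" is a hypothetical example, and whether a viewer honors remote templates depends on its security settings):
// Serve an FDF document whose /F entry points at the remote PDF template.
header('Content-Type: application/vnd.fdf');
$pdfUrl = 'https://platweb-eur.qa.reachlocal.com/test/546_df50ab50418e1e925e81020815217649.pdf';
echo "%FDF-1.2\n";
echo "1 0 obj\n";
echo "<< /FDF << /Fields [ << /T (name) /V (John Doe) >> ] /F ($pdfUrl) >> >>\n"; // "name" is a hypothetical field
echo "endobj\n";
echo "trailer\n<< /Root 1 0 R >>\n";
echo "%%EOF\n";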
Sorry if I did not get this title right.
I have on my server a redirect.php script that receives a URL passed by the client user, fetches the content using the file_get_contents() function, and then shows it to the user with echo().
The problem is when the user points that URL directly at a PDF or JPG file, and the script then shows the file contents as binary code.
Once the code recognizes that the requested URL points directly to a downloadable file, what function or header should I send so that the user's browser asks him to download the file instead of showing it?
Do I have to first put it into a file on my server, or can I do it directly from a call like file_get_contents()? If I can do it without writing it to my server, that would be a much better approach.
I can't point directly to the server because some sites are blocked by my employer, and the third-party company that provides this service thinks that Stack Exchange sites are malicious and not constructive, and has tagged them as online communities like Facebook.
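A minimal sketch of the header approach (assuming the bytes are already in hand from file_get_contents(); no temporary file on the server is needed):
$url = $_GET['url']; // URL supplied by the client user
$data = file_get_contents($url);
// Tell the browser to prompt for a download instead of rendering inline.
$filename = basename(parse_url($url, PHP_URL_PATH));
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $filename . '"');
header('Content-Length: ' . strlen($data));
echo $data;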
Try this:
$sourcefile = "http://www.myremotewebsite.com/myfile.jpg"; // remote file to fetch
$destfile = "myfile.jpg"; // local destination path
copy($sourcefile, $destfile); // copy() accepts URLs when allow_url_fopen is enabled
I need to load a gzipped XML from an external link, but I don't know how to do it.
Currently I load the XML file onto my server and then change the XML address in the variable every 3 days.
The code I'm using now for loading the XML is this:
$xmlFile = new SimpleXMLElement('xml/abcd_201407281213_12074_28203833.xml', 0, true);
The link given to me to download the XML file from their server is like this:
http://nameofthewebsite.com/folder/28203833C78648187.xml?ticket=BA57F0D9FF910FE4DB517F4EC1A275A2&gZipCompress=yes
Can someone help me, please?
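A minimal sketch, assuming the server returns the XML as a gzip-compressed body (gzdecode() requires PHP 5.4+; the compress.zlib:// stream wrapper is an older alternative):
$url = 'http://nameofthewebsite.com/folder/28203833C78648187.xml?ticket=BA57F0D9FF910FE4DB517F4EC1A275A2&gZipCompress=yes';
$raw = file_get_contents($url); // the compressed bytes
$xmlString = gzdecode($raw); // decompress the gzip payload
$xmlFile = new SimpleXMLElement($xmlString); // parse from the string instead of a file path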
What I'm trying to do is use PHP to scrape a website whose URL I enter as a parameter.
I want the whole raw source code, but that's not all:
I then want it saved into an HTML page on the local server running the PHP script.
Is there an easy snippet for this, or can someone write up the code?
For example
I want to scrape http://google.com
So for instance, mysite.com/scrape.php?url=http://google.com
I want it to save the front page of google into http://mysite.com/scraped/google.com.html
Here's a script that will save the contents of the specified URL into a file named scraped.html:
if (isset($_GET['url'])):
    $contents = file_get_contents($_GET['url']);
    file_put_contents('scraped.html', $contents);
endif;
To use a URL in the call to file_get_contents() you must enable allow_url_fopen in your php.ini file.
Of course, this will only save the actual source of the requested URL and not any other resources, such as images, scripts, and stylesheets.
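To get the per-site naming the question asks for (scraped/google.com.html), a sketch along these lines could work, assuming a writable scraped/ directory already exists:
if (isset($_GET['url'])):
    $url = $_GET['url'];
    $host = parse_url($url, PHP_URL_HOST); // e.g. "google.com"
    if ($host) {
        file_put_contents('scraped/' . $host . '.html', file_get_contents($url));
    }
endif;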