How can I parse consumed web-service results into a database table? - php

I'm using the National Weather Service's REST web service to return forecast data for locations. I can get the data back and display it onscreen by calling an XSLT, and I can use XSLT to transform the returned XML to a file of SQL insert statements that I can then import manually. However, I'm a bit confused as to how I can do this automatically.
Some clarification: I need to have a cron job run on a scheduled basis to pull in data from the web service. I then need to somehow parse that data out into database records. This all needs to happen automatically, via the single php file that I'm allowed to call in the cron job.
Can anyone give me an idea of how I'd go about this? Do I need to save the XML response to an actual file on my server, and then transform that file into a sql file, and then somehow (again automatically) run an import on the SQL file? Ideally, I wouldn't have to save anything; I'd just be able to do a direct insert via a database connection in my php file. Would that work if I looped through the response XML using a DOM parser rather than an XSLt file?
I'm open to any alternative; I've never done this before, have no idea of how to go about it, and have been unable to find any kind of articles or tutorials about parsing XML data into a database directly.

You need to parse the XML data instead of using XSLT to transform it.
You can use xml_parse_into_struct to turn it into a PHP array and work from it there.
It is probably easier to use SimpleXML and walk the XML tree, though.
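A minimal sketch of that SimpleXML approach for the cron-driven script. The forecast URL, element names, and table/column names below are hypothetical; adjust them to the actual structure of the NWS response and your own schema:

```php
<?php
// Walk the forecast XML with SimpleXML and insert rows directly via PDO
// prepared statements -- no intermediate SQL file needed.

function extractForecasts(string $xml): array
{
    $doc = new SimpleXMLElement($xml);
    $rows = [];
    // Hypothetical structure: <data><forecast><location/><temperature/></forecast>...</data>
    foreach ($doc->forecast as $f) {
        $rows[] = [
            'location' => (string) $f->location,
            'temp'     => (int) $f->temperature,
        ];
    }
    return $rows;
}

// In the script your cron job calls:
// $xml  = file_get_contents('https://example.gov/forecast?point=...'); // hypothetical URL
// $rows = extractForecasts($xml);
// $pdo  = new PDO('mysql:host=localhost;dbname=weather', 'user', 'pass');
// $stmt = $pdo->prepare('INSERT INTO forecasts (location, temp) VALUES (?, ?)');
// foreach ($rows as $r) {
//     $stmt->execute([$r['location'], $r['temp']]);
// }
```

Nothing is saved to disk: the response goes straight from the web service into the database connection.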
Considering you already have an XSLT transformation, you can also write the SQL out to a file and pipe it into mysql directly (note: echoing the filename would only pipe the literal string, so redirect the file itself):
exec("mysql -uusername -ppassword database < xml_sql.txt")
Good Luck!

Related

Importing a JSON file into a database

I have generated a JSON file and it is formatted correctly, the same way I exported it from my MySQL database. Now I need to put it back :-)
I would like to read up on how to do this, so if there is a good link, I welcome it. I have used the phrase
php - script to import JSON file into MySQL database
in Google, and others like it, but I'm having no luck.
Also if I import a file and the record already exists, how do I deal with duplicate issues? Can I make it so that it overwrites automatically?
I am uploading it from an iOS app, so I do not see the php file at work.
You can use json_decode
json_decode($json_from_ios_app, true)
json_decode will return an associative array. Based on this structure, construct your SQL queries.
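A minimal sketch of the decode-then-insert step, which also handles the duplicate question: with MySQL you can use INSERT ... ON DUPLICATE KEY UPDATE so an existing record is overwritten instead of duplicated. The table and column names here are hypothetical:

```php
<?php
// Decode the JSON uploaded by the iOS app into an associative array,
// rejecting malformed input.
function decodeUpload(string $json): array
{
    $records = json_decode($json, true); // true => associative arrays
    if (!is_array($records)) {
        throw new InvalidArgumentException('Malformed JSON upload');
    }
    return $records;
}

// $records = decodeUpload($json_from_ios_app);
// $pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
// $stmt = $pdo->prepare(
//     'INSERT INTO items (id, name) VALUES (:id, :name)
//      ON DUPLICATE KEY UPDATE name = VALUES(name)'   // overwrite on duplicate id
// );
// foreach ($records as $r) {
//     $stmt->execute([':id' => $r['id'], ':name' => $r['name']]);
// }
```

The ON DUPLICATE KEY UPDATE clause only works when the table has a primary or unique key for the duplicates to collide on.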

PHP & XML, What Is Recommended?

Background:
So I'm writing a web-service-style web application as a way to increase my knowledge of how PHP and XML work together. I eventually want to take that XML data and use it in a mobile phone application, but that's a different issue. I can connect to the data, pull, and process all the information with PHP, and I've managed to get it exporting to CSV. I now want to begin pushing that data out as XML.
Question:
What is the (a) recommended way to work with XML in PHP?
References:
PHP Manual, XML Portion
I suggest using SimpleXML, which makes XML operations much easier to handle.
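For pushing data out as XML, a small sketch with SimpleXML: build the tree in memory, then emit it as a string (which you could send with a Content-Type: text/xml header). The element names are arbitrary:

```php
<?php
// Turn an array of rows into an XML document using SimpleXML.
function rowsToXml(array $rows): string
{
    $root = new SimpleXMLElement('<records/>');
    foreach ($rows as $row) {
        $rec = $root->addChild('record');
        foreach ($row as $key => $value) {
            // Property assignment creates a child element and escapes
            // special characters in the value.
            $rec->{$key} = (string) $value;
        }
    }
    return $root->asXML();
}

// echo rowsToXml([['name' => 'Alice', 'city' => 'Boston']]);
```

The same SimpleXMLElement API works in reverse for reading: construct it from a string and iterate the children.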

Can PHP be used inside an XML file?

I am trying to generate a RSS feed from a mysql database I already have. Can I use PHP in the XML file that is to be sent to the user so that it generates the content upon request? Or should I use cron on the PHP file and generate an xml file? Or should I add the execution of the php file that generates the xml upon submitting the content that is to be used in the RSS? What do you think is the best practice?
All three approaches are technically possible. However, I would not use cron, because it delays the update of your XML files after the database content has changed.
You can easily embed PHP code in your XML files; you just have to make sure the files are interpreted as PHP on the server side, either by renaming them with a *.php extension or by changing the server directives in the .htaccess file.
But I think the best practice here is to generate new XML files upon updating the database contents. I would guess that the XML files are viewed more often than the database content changes, so this approach reduces the server load.
Use a cron to automate a PHP script that builds the XML file. You can even automate the mail part as well in your PHP.
The third method you mentioned. I don't see how cron could be used here, since the data comes in with users' requests, and the first method cannot be implemented.
Set the Content-type header to text/xml and have your PHP script generate XML just as it would generate any other content. You may want to consider using caching though, so you don't overwhelm the server by accident.
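A sketch of that header-plus-caching approach: serve the feed from a cached file while it is fresh, and only hit the database when the cache has expired. The cache path and TTL are arbitrary choices for illustration, and buildFeedFromDatabase() stands in for your own query and XML generation:

```php
<?php
// True when the cached copy exists and is younger than the TTL.
function cacheIsFresh(string $path, int $ttlSeconds, int $now): bool
{
    return file_exists($path) && ($now - filemtime($path)) < $ttlSeconds;
}

// header('Content-Type: text/xml; charset=utf-8');
// $cache = __DIR__ . '/feed.xml';
// if (cacheIsFresh($cache, 300, time())) {      // reuse for up to 5 minutes
//     readfile($cache);
// } else {
//     $xml = buildFeedFromDatabase();           // hypothetical: query + XML output
//     file_put_contents($cache, $xml);
//     echo $xml;
// }
```

This keeps the feed request-driven (no cron) while bounding how often the database is queried.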

Querying large XML file (600mb+) in PHP or JavaScript?

I have a large XML file (600mb+) and am developing a PHP application which needs to query this file.
My initial approach was to extract all the data from the file and insert it into a MySQL database, then query it that way. The only issue was that this was still slow, and the XML data gets updated regularly, meaning I need to download, parse, and insert the data from the XML file into the database every time it is updated.
Is it actually possible to query a 600mb file? (for example, searching for records where TITLE="something here"?) Is it possible to get it to do this in a reasonable amount of time?
Ideally would like to do this in PHP, though I could also use JavaScript too.
Any help and suggestions appreciated :)
Constructing an XML DOM for a 600+ MB document is definitely a way to fail. What you need is a SAX-based (streaming) API. SAX does not usually allow XPath to be used, but you can emulate it with imperative code.
As for the file being updated, is it possible to retrieve only the differences somehow? That would massively speed up subsequent processing.
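In PHP the usual streaming tool is XMLReader, a pull parser that is similar in spirit to SAX: it visits nodes one at a time, so memory stays flat regardless of file size. A sketch of the TITLE search, with hypothetical record/TITLE/ID element names:

```php
<?php
// Stream through the document, expanding one <record> at a time into a
// small in-memory node so its children are easy to inspect.
function findByTitle(string $source, string $wanted): array
{
    $reader = new XMLReader();
    $reader->XML($source); // use $reader->open($path) for a file on disk
    $matches = [];
    while ($reader->read()) {
        if ($reader->nodeType === XMLReader::ELEMENT && $reader->name === 'record') {
            $doc  = new DOMDocument();
            $node = simplexml_import_dom($doc->importNode($reader->expand(), true));
            if ((string) $node->TITLE === $wanted) {
                $matches[] = (string) $node->ID;
            }
        }
    }
    $reader->close();
    return $matches;
}
```

A full scan of 600 MB will still take seconds rather than milliseconds, so for repeated ad-hoc queries the database approach (or at least an index built in one streaming pass) remains the better fit.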

How do I save an html page to a database?

I just want to know if it's possible: how do I save an HTML page and all of its contents to a database? For example, a database that holds all my HTML pages.
I also want to know how to retrieve them. We're using PHP as our language.
Thank you.
Well, you'll need to:
1. Grab the page with an HTTP request, just like your browser does
2. Parse the HTML to find external resources (script, img, object, etc.)
3. Grab those external resources
4. Save them all in your database in a BLOB field
5. Optionally, alter your original HTML document to change those resources' locations
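A sketch of steps 1-2 above using DOM to collect the external resource URLs; steps 3-4 are then ordinary fetches and prepared-statement inserts. The table and column names are hypothetical:

```php
<?php
// Parse an HTML page and collect the URLs of its external resources
// (img and script tags here; extend for link, object, etc.).
function externalResources(string $html): array
{
    $doc = new DOMDocument();
    @$doc->loadHTML($html); // suppress warnings from real-world markup
    $urls = [];
    foreach ($doc->getElementsByTagName('img') as $img) {
        $urls[] = $img->getAttribute('src');
    }
    foreach ($doc->getElementsByTagName('script') as $s) {
        if ($s->getAttribute('src') !== '') {
            $urls[] = $s->getAttribute('src');
        }
    }
    return $urls;
}

// $html = file_get_contents('http://example.com/');             // step 1
// $resources = externalResources($html);                        // step 2
// foreach ($resources as $url) { /* fetch and store each one */ } // steps 3-4
// $pdo->prepare('INSERT INTO pages (url, body) VALUES (?, ?)')
//     ->execute(['http://example.com/', $html]);
```

Relative URLs would need to be resolved against the page's base URL before fetching.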
Is this what you are trying to do? http://www.phpfreaks.com/forums/index.php?topic=219271.0
You can simply store the $out in the db instead of saving as html
Assuming MySQL, here is the way to connect to the database and write data into it.
PHP and MySQL
HTML is just text. You can store it in a database in a TEXT field.
There are plenty of DBMS you can use and plenty of ways to do it.
You can have a look at the PDO extension to directly consume a MySQL or SQlite connection for instance.
You can also use an ORM like Doctrine
If you are trying to save the final results of your PHP script (ie. what is sent to the browser) you will need to look into Output Buffering.
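A minimal sketch of that output-buffering idea: capture everything the script would have echoed, then store it instead of (or as well as) sending it. The echo here stands in for your real page rendering:

```php
<?php
ob_start();                                    // start capturing output
echo '<html><body>Hello</body></html>';        // stand-in for the real rendering
$html = ob_get_clean();                        // grab the buffer and stop capturing

// The captured markup can now be stored like any other string:
// $stmt = $pdo->prepare('INSERT INTO pages (body) VALUES (?)');
// $stmt->execute([$html]);
```

ob_get_clean() discards the buffer, so nothing is sent to the browser; use ob_get_contents() plus ob_end_flush() if you want to store the output and still serve it.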
As others have already suggested, yes, it's possible to save HTML pages inside databases like MySQL or SQLite. Another way to think of a "database" is flat files. Just like web crawlers or tools such as wget/curl that crawl (and download) pages to disk, you can program something similar in PHP (using libraries such as cURL) and save those pages to your disk. How do you retrieve them? Just display them with a web browser, or open the file, display its contents, and close it, all with PHP.
