I'm using PHP's XMLReader to validate and parse XML data. The XML is validated against a local XSD schema file via the XMLReader::setSchema function, and against remote XSD schemas over http:// via xsd:import/include. Everything works fine, but it fetches the remote schemas from the net and re-reads the local ones from disk every time it's called.
So my questions are:
Is there a way to cache the remote XSD schemas in local RAM?
For local schema files, I think tmpfs on Linux would work fine, but is there another way to cache local XSD schema files?
Solution
Thanks to VolkerK for pointing out the XML catalog system. It works fine with libxml/PHP XMLReader. On Linux, just edit the file /etc/xml/catalog (it comes from the xml-common package on Fedora) and add some entries like (for example):
<rewriteURI uriStartString="http://schemas.xmlsoap.org/soap/envelope/" rewritePrefix="/etc/xml/SOAP-Envelope.xsd"/>
<rewriteURI uriStartString="http://schemas.xmlsoap.org/soap/encoding/" rewritePrefix="/etc/xml/SOAP-Encoding.xsd"/>
then manually download the schemas (e.g. http://schemas.xmlsoap.org/soap/encoding/ -> /etc/xml/SOAP-Encoding.xsd), and PHP's XMLReader works as expected when parsing SOAP messages.
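For reference, here's a minimal sketch of the validating reader (the input file name is a placeholder); once the catalog entries are in place, libxml resolves the remote schema references to the local copies transparently:

<?php
// Minimal sketch: validate while reading; remote xsd:import/include URLs
// are rewritten to local files by /etc/xml/catalog.
$reader = new XMLReader();
$reader->open('soap-message.xml');          // placeholder input file
$reader->setSchema('/etc/xml/SOAP-Envelope.xsd');

while ($reader->read()) {
    if (!$reader->isValid()) {
        // handle the validation error (see libxml_get_errors())
    }
}
$reader->close();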
PHP's XMLReader uses libxml, and libxml supports XML catalogs:
What is a catalog? Basically it's a lookup mechanism [...]
It is basically used for 3 things:
[...]
providing a local cache mechanism allowing to load the entities associated to public identifiers or remote resources, this is a really important feature for any significant deployment of XML or SGML since it allows to avoid the aleas and delays associated to fetching remote resources.
Haven't tried it but I guess it's worth a test run.
Related
I don't know much about Delphi / ClientDataSets, but I'm willing to look into it. I have a question before I pursue it, though, to determine whether what I want to achieve is feasible.
I want to use a PHP script to save a dozen subsets of my MySQL database to CDS files once a week. Is there a file specification I can follow to create a CDS file? I'll be running the script on a shared Linux web host, so I don't think running Delphi on the server is viable.
Thanks!
There is a related question on Stackoverflow which includes a partial XSD:
Anyone that has a partial XSD that describes the METADATA section of Delphi TClientDataSet XML files?
You can use this XSD and an XML library to create XML files from your data that are compatible with TClientDataSet, so they can be opened in a Delphi application.
I don't know PHP's XML libraries, but in many languages XML libraries can generate mapping code from an XSD, which can then be used to read and write XML files that conform to the schema definition.
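For illustration, here is a hedged sketch (in PHP, using DOMDocument) of building a minimal TClientDataSet-style data packet. The field names and types are made up, and the exact element/attribute set should be verified against the partial XSD in the linked question:

<?php
// Hedged sketch: a minimal CDS-style data packet built with DOMDocument.
// Verify element and attribute names against the partial XSD.
$doc    = new DOMDocument('1.0', 'UTF-8');
$packet = $doc->appendChild($doc->createElement('DATAPACKET'));
$packet->setAttribute('Version', '2.0');

$meta   = $packet->appendChild($doc->createElement('METADATA'));
$fields = $meta->appendChild($doc->createElement('FIELDS'));

$field = $fields->appendChild($doc->createElement('FIELD'));
$field->setAttribute('attrname', 'id');     // illustrative field
$field->setAttribute('fieldtype', 'i4');

$rows = $packet->appendChild($doc->createElement('ROWDATA'));
$row  = $rows->appendChild($doc->createElement('ROW'));
$row->setAttribute('id', '1');              // one sample row

$doc->save('weekly-subset.xml');            // placeholder output name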
I'm currently developing a web application with Zend Framework and PHP 5.3. I have an XML file that contains config and mapping information (about 1,500 lines). On each request I run an XPath query to get information from that XML file. The information in the XML file is static and does not change after the application is deployed.
Is it good practice to generate, on the first request, a PHP file containing the XML information as static arrays, and then load that PHP file on every subsequent request instead of querying the XML?
You can cache the parsed config file as a PHP source file with var_export.
Generating code to cache resources is done in several places in Zend Framework, for example the autoloader, so I presume it is good practice.
There is also another way to cache it: with serialize (make sure to serialize an array, not, say, a SimpleXML object), or with Zend_Cache, which does more or less the same but is more flexible about how the result is stored.
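A minimal sketch of the var_export approach (file names are placeholders):

<?php
// Cache the parsed XML config as a plain PHP array file.
$cacheFile = __DIR__ . '/cache/config.php';

if (is_file($cacheFile)) {
    $config = include $cacheFile;   // fast path: just include the array
} else {
    $xml    = simplexml_load_file('config.xml');
    $config = json_decode(json_encode($xml), true); // crude SimpleXML -> array
    file_put_contents(
        $cacheFile,
        '<?php return ' . var_export($config, true) . ';'
    );
}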
Since the XML doesn't change after deployment, I think it would be best to transform that XML in your local dev environment, not on the production system on demand. It's not a good idea to generate source code on the production system that will be automatically included without any validation.
I'm not very familiar with XSLT, but it might be an option for you, depending on the concrete structure of that XML.
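If you do go the XSLT route, the build step could look something like this sketch (file names and the stylesheet are placeholders; requires the xsl extension):

<?php
// One-off transform run in the dev environment, not on production.
$xml = new DOMDocument();
$xml->load('config.xml');

$xsl = new DOMDocument();
$xsl->load('config-to-php.xsl');    // hypothetical stylesheet

$proc = new XSLTProcessor();
$proc->importStylesheet($xsl);
file_put_contents('config.generated.php', $proc->transformToXML($xml));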
I currently have a PHP file that must read hundreds of XML files. I have no choice in how these XML files are constructed; they are created by a third party.
The first XML file is a large list of titles for the rest of the XML files, so I search the first XML file to get the file names of the rest.
I then read each XML file, searching its values for a specific phrase.
This process is really slow. I'm talking 5 1/2 minute runtimes, which is not acceptable for a website; customers won't stay around that long.
Does anyone know a way to speed my code up, to a maximum runtime of around 30s?
Here is a pastebin of my code : http://pastebin.com/HXSSj0Jt
Thanks, sorry for the incomprehensible English...
Your main problem is that you're trying to make hundreds of HTTP downloads to perform the search. Unless you get rid of that restriction, it's only going to go so fast.
If for some reason the files aren't cacheable at all (unlikely), not even some of the time, you can pick up some speed by downloading in parallel. See the curl_multi_*() functions. Alternatively, use wget from the command line with xargs to download in parallel.
That approach sounds crazy if you have any kind of traffic, though.
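If you do go parallel, here's a minimal curl_multi sketch (assuming $urls holds the file URLs):

<?php
// Fetch many URLs concurrently instead of one at a time.
$mh      = curl_multi_init();
$handles = array();
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[$url] = $ch;
}

do {
    curl_multi_exec($mh, $active);
    curl_multi_select($mh);         // wait for activity, don't busy-loop
} while ($active > 0);

$bodies = array();
foreach ($handles as $url => $ch) {
    $bodies[$url] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);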
Most likely, the files can be cached for at least a short time. Look at the HTTP headers and see what kind of freshness information their server sends. It might say how long until the file expires, in which case you can save it locally until then. Or it might send a Last-Modified or ETag header, in which case you can make conditional GET requests, which should speed things up further.
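A hedged sketch of a conditional GET using a previously saved ETag (the URL and $etag are placeholders):

<?php
// Ask the server for the file only if it changed since we last saw it.
$ch = curl_init('http://example.com/data.xml');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, array('If-None-Match: ' . $etag));
$body   = curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);

if ($status === 304) {
    // Not modified: reuse the locally cached copy.
} else {
    // Changed: store $body and the new ETag for next time.
}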
I would probably set up a local Squid cache and have PHP make these requests through Squid. It will take care of all the "use the local copy if it's fresh, or conditionally retrieve a new version" logic for you.
If you still want more performance, you can transform the cached files into a more suitable format (e.g., stick the relevant data in a database). Or, if you must stick with the XML format, you can do a string search on the raw file first, to test whether it's worth parsing that file as XML at all.
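The pre-filter can be as simple as this sketch ($cachedFile and $phrase are placeholders):

<?php
// Only pay the XML parsing cost for files that can possibly match.
$raw = file_get_contents($cachedFile);
if (strpos($raw, $phrase) !== false) {
    $xml = simplexml_load_string($raw);
    // ... now search the parsed document properly ...
}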
First of all, if you have to deal with large XML files on each request to your service, it is wise to download the XML files once, preprocess them, and cache them locally.
If you cannot preprocess and cache the XML files and have to download them on each request (which I don't really believe is the case), you can try to optimize by using XMLReader or some SAX event-based XML parser. The problem with SimpleXML is that it uses DOM underneath. DOM (as the name implies) builds a document object model in your PHP process's memory, which takes a lot of time and eats tons of memory. I would go so far as to say that DOM is useless for parsing large XML files.
XMLReader, on the other hand, lets you traverse a large XML document node by node while barely using any memory, with the tradeoff that you cannot issue XPath queries or use any other non-sequential node access patterns.
For how to use XMLReader, consult the PHP manual for the XMLReader extension.
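As a starting point, a minimal streaming sketch (the element name 'item' and the file name are placeholders):

<?php
// Scan a large document element by element without loading it all.
$reader = new XMLReader();
$reader->open('large.xml');

while ($reader->read()) {
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->name === 'item') {
        // readString() returns just this node's text content.
        if (strpos($reader->readString(), 'specific phrase') !== false) {
            // found a match
        }
    }
}
$reader->close();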
We run multiple Windows/IIS/.Net sites (up to 30+ sites per server). Each site is customized for the individual customer via a configuration file that contains the settings.
I am tasked with writing a small tool that will 'grep' all of the config files on a given server for a particular config setting (or settings) and return the values for display in a nicely formatted web page. It will save many groups a lot of time, especially since most groups don't have access to the production servers but still need to know how a customer is currently configured.
I have working code that finds all .config files from a starting path, and I can easily extend it to do the grepping. Here are the challenges:
I want to aggregate this data from MULTIPLE servers. That means the tool will be hosted on its own server and will make calls to a list of servers.
I'm limited to using .NET/ASP on the actual servers (they won't install PHP on IIS), but I'm writing the tool in PHP.
PROPOSED DESIGN: From my vantage point, the best way to accomplish this is to write my PHP tool and have it make AJAX or cURL requests to ASP scripts that live on each server in the list. Each ASP script would recursively walk the directories to find the config files, grep each file for the data, and return it in the response.
Is that the best way to accomplish this? Should the ASP or the PHP side do the 'heavy lifting'? Is there a recommended data format I should use to pass the data?
Any ideas or samples would be great. If you need more info, I can provide!
Thanks!
Update: here's an example of a config. It's a basic ASP file that gets included in other scripts.
custConfig1 = " 8,9,6:5:5 "
custConfig2 = " On "
I think you're bang on using PHP for the "receiving" script, and I'm pretty sure you have that in hand.
Based on the format of your example config file, you could use ExecuteGlobal in classic ASP to load each file as you loop through them in your recursive directory lookup. Then you can use the custConfig1 et al. names in your script, e.g. (pseudo):
for each file in configFiles
    ExecuteGlobal LoadFile(file)   ' pseudo: defines custConfig1, custConfig2, ...
    output("custConfig1") = custConfig1
next
Return what you need as JSON using a handy library, and then do all the "hard" work of collating it and outputting it in PHP.
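On the PHP side, collecting the JSON from each server could be as simple as this sketch (the server URLs are placeholders):

<?php
// Fetch each server's ASP endpoint and merge the decoded settings.
$servers = array(
    'http://server1/configscan.asp',
    'http://server2/configscan.asp',
);
$all = array();
foreach ($servers as $url) {
    $json      = file_get_contents($url);
    $all[$url] = json_decode($json, true);  // associative array per server
}
// $all is now ready to be collated into the tabled display.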
Yes, "grep" (if by that you mean importing a text file and using reg expressions to navigate it) isn't the best solution, in my humble opinion, use either JSON or XML as the format, and use PHP's built in XML or JSON tools.
JSON: http://php.net/manual/en/book.json.php
XML: http://php.net/manual/en/book.simplexml.php
You could use the DOM to navigate the XML as an alternative to SimpleXML, but SimpleXML is easier to learn (again, in my opinion) and will work for your needs.
How can I create an (empty) XML file from an existing XSD schema?
Which PHP (5.3) functions are needed for this?
I was also searching for a way to directly "initialize" a DOM XML object that corresponds to an existing XSD in PHP. That makes it easy to "feed" the empty XML structure with my own data.
I found only ONE method that does it: the SDO DAS XML extension for PHP
http://www.php.net/manual/en/sdo-das-xml.examples.php
See example #2. Unfortunately, the extension is not included in PHP 5.3 by default; you have to add it through PECL, which didn't work on my Windows PC, so I could not test it.
Otherwise, you would need to write a PHP parser that takes the XSD file and builds an XML file tag by tag.
Depending on the complexity of your XSD schema, this could take a short-to-medium amount of time, or it could be entirely unfeasible, in the sense of "you should probably just write the XML file by hand".
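For a very simple schema, a naive starting point might look like this sketch ('schema.xsd' is a placeholder; real schemas with nested types, imports, and occurrence rules need far more work):

<?php
// Naive sketch: emit an empty root element from the first top-level
// xs:element declaration. This ignores types, nesting, and imports.
$xsd = simplexml_load_file('schema.xsd');
$xsd->registerXPathNamespace('xs', 'http://www.w3.org/2001/XMLSchema');

$elements = $xsd->xpath('/xs:schema/xs:element');
if ($elements) {
    $doc = new DOMDocument('1.0', 'UTF-8');
    $doc->appendChild($doc->createElement((string) $elements[0]['name']));
    echo $doc->saveXML();
}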