I'm thinking of using XML for storing users' content on my web app. I'm a PHP newbie and don't know much about how I would do this.
The content is private to each user; it's not shared publicly, and it's not passwords or anything like that. So I was wondering how I can create and edit XML files on the backend of the server, accessed privately by PHP, much like what it does with MySQL.
My questions are: 1) Is it possible? 2) If so, how would I do it using PHP?
Put all XML files in a folder that is not accessible via the web, i.e. outside of your document root. Alternatively, you can use .htaccess to restrict access to that folder.
For reading and writing those XML files from your directory, you can use SimpleXML. You don't need anything besides plain PHP and some XML processing.
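For example, something along these lines (just a minimal sketch, assuming one XML file per user kept in a userdata/ folder outside the document root; the file and element names are placeholders):

    <?php
    // Path to the current user's XML file, outside the document root.
    $file = __DIR__ . '/../userdata/user_' . (int) $userId . '.xml';

    // Load the existing file, or start a new document with a root element.
    if (file_exists($file)) {
        $xml = simplexml_load_file($file);
    } else {
        $xml = new SimpleXMLElement('<content/>');
    }

    // Add or change some data...
    $xml->addChild('note', 'Hello world');

    // ...and write it back to disk.
    $xml->asXML($file);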
This should get you started ;).
Theoretically it is possible: you can store the XML as .php files that output XML-formatted tags only after PHP authentication. These files can be created and modified dynamically using PHP's filesystem functions, one per user, in a folder (or folders) dedicated to your users.
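As a rough illustration of that idea (the session check, file name and content are just placeholders), each generated per-user file could look something like this:

    <?php
    // data/user_42.php -- only outputs its XML to the logged-in owner.
    session_start();
    if (!isset($_SESSION['user_id']) || $_SESSION['user_id'] !== 42) {
        header('HTTP/1.1 403 Forbidden');
        exit;
    }
    header('Content-Type: text/xml');
    echo '<?xml version="1.0"?><content><note>Hello world</note></content>';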
I know about the directory structure of Java-based web applications. A simple web application will have something like this:
web-application folder
    WEB-INF folder
Under WEB-INF you can have your lib, classes, and so on.
Anything that should be publicly available is kept outside WEB-INF, and anything that should not be directly accessible is kept inside it. Files inside WEB-INF are accessible only to Java classes or JSPs; a user cannot reach them directly through the browser. So WEB-INF works like a barrier, right?
Now, my question is: I am new to PHP and I want something like WEB-INF here. Is there any way to do this? Or how do PHP-based web applications protect their private files from direct access?
Thanks
You can place the PHP files wherever you like, and you can restrict direct access by putting them in a folder (named whatever you want) and using an .htaccess file to prevent users from accessing that folder directly. Even if someone were to request a PHP file directly, it would show a blank screen and not execute anything, as long as those files contain only your functions/classes. Likewise, they can never see your PHP code in any case, since it all runs server-side and only the output you want is sent to the browser.
I have also heard of people putting some files in a folder above the http_docs/ or www/ folder so that the file and even the entire folder cannot be accessed at all from the website.
So, I would look up .htaccess files (and learn more about PHP!). Set up some PHP files that have functions in them, visit the page in your web browser, view the source, and so on.
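For illustration, a typical layout might look something like this (the folder and function names here are just examples): the public script pulls its logic in from a folder that isn't directly reachable from the web.

    <?php
    // public_html/index.php -- the only file users request directly.
    // The actual logic lives in a protected folder (blocked via .htaccess,
    // or placed above the document root entirely).
    require_once dirname(__FILE__) . '/../private/functions.php';

    // render_homepage() is defined in the protected file; only its output
    // is ever sent to the browser.
    echo render_homepage();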
I have a PHP Yii application that uses an RSS feed reader. I wanted to develop some good tests, and I wanted to attempt to read an RSS feed under my own control as part of my test suite. The idea is I request this feed from "my.localhost/testfeed/{name}" and my application is served locally at "my.localhost"
I made a controller (TestFeedController) which uses the CViewAction to serve static rss files (stored in .php files). I had to put these in "protected/views/testFeed/pages/" to make it work.
I wanted to store the files in "protected/tests/views/testFeed/pages/" to separate them better from actual application code, but was unable to get this to work (I tried to overload getViewPath()). Is there a way to get a view file in CViewAction that is NOT in "protected/views/*"?
Is there a better way to test reading a "remote" RSS file which is under my local control? I considered serving it on another virtual host, but I wanted to keep the project tests with the project.
You could use renderInternal() for your test files and pass it an absolute file path, e.g. dirname(__FILE__).'/../tests/views/...etc...'.
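Something like this might work (a rough Yii 1.x sketch; the action name and path alias are assumptions, not tested):

    <?php
    class TestFeedController extends Controller
    {
        public function actionView($name)
        {
            // Whitelist the name so it can't escape the test views folder.
            if (!preg_match('/^[A-Za-z0-9_]+$/', $name)) {
                throw new CHttpException(404, 'Feed not found.');
            }

            // protected/tests/views/testFeed/pages/<name>.php
            $file = Yii::getPathOfAlias('application.tests.views.testFeed.pages')
                  . DIRECTORY_SEPARATOR . $name . '.php';

            header('Content-Type: application/rss+xml');
            $this->renderInternal($file);
        }
    }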
I want to generate a dynamic zip/gzip file using PHP (the user clicks a button/link/whatever, and the file is sent for download) which contains a folder structure that may or may not have files (blank or containing data) in all the folders.
The aim is to be able to unzip this file and have a folder/file structure ready to go.
I believe I am familiar with everything involved except how to generate the file without first creating the files locally on my server, zipping them, and sending them for download. That approach uses 'writing the files to the server' as a sort of middleman that I would just as soon bypass if possible.
I would like to assume this is as easy as sending raw data to the browser with a zip/gzip header, but I haven't had much luck finding any information on this.
Any pointers in the right direction would be most helpful.
You can do that with the ZipArchive class. Have a look at functions such as ZipArchive::addFromString(), which will allow you to add files to the archive without actually saving them to disk first.
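A minimal sketch of that approach (the folder and file names are placeholders): the only thing that touches the disk is a temporary file for the archive itself, which is streamed to the browser and then deleted.

    <?php
    // Build the archive in a temp file.
    $tmp = tempnam(sys_get_temp_dir(), 'zip');

    $zip = new ZipArchive();
    $zip->open($tmp, ZipArchive::OVERWRITE);

    // Folder structure plus in-memory file contents -- no source files
    // ever exist on the server.
    $zip->addEmptyDir('images');
    $zip->addEmptyDir('docs');
    $zip->addFromString('docs/readme.txt', "Generated " . date('c') . "\n");

    $zip->close();

    // Send it to the browser, then clean up.
    header('Content-Type: application/zip');
    header('Content-Disposition: attachment; filename="structure.zip"');
    header('Content-Length: ' . filesize($tmp));
    readfile($tmp);
    unlink($tmp);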
I would like to write a setup script for my PHP application which does a minimum-requirements check, gets the DB credentials and DB prefix, saves them, creates the DB tables, and so on. Now I would like to know: what is the best practice for writing and saving the DB credentials? Write them as an array into a .php file? Or into an XML file?
I don't think there is a best practice for this; there are so many ways people use configuration files. Some use PHP arrays, some use XML files, some use INI files, some use JSON files, and I'm sure some people create proprietary formats.
What you do want to take into account is where you will store this file. If it is in the document root, people can request it. XML/INI/JSON files are plain text and, by default, that makes it easy for people to 'retrieve' the file contents. A PHP file is parsed server-side, so it just returns an empty page.
Ideally you'd store the configuration file outside of the document root, but not all webhosts allow you to do so. So I'd say, if you want to release an application people can install themselves easily, a PHP file might be the easiest way to go.
Write them as an array into a .php file. This satisfies speed (no XML parsing or extra file handling is needed per page) and security (.php files don't get served as text the way your XML would).
I also tend to put the private.php that contains my mysql credentials in the directory above the http root, and load it like require_once("../private.php");
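For example, a rough sketch of that setup (two separate files shown one after the other; the key names and the PDO usage are just an illustration):

    <?php
    // private.php -- kept one level above the document root.
    return array(
        'db_host' => 'localhost',
        'db_user' => 'app_user',
        'db_pass' => 'secret',
        'db_name' => 'app_db',
    );

    <?php
    // In a public script: load the config and connect.
    $config = require dirname(__FILE__) . '/../private.php';
    $db = new PDO(
        'mysql:host=' . $config['db_host'] . ';dbname=' . $config['db_name'],
        $config['db_user'],
        $config['db_pass']
    );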
You are asking about setting up the environment, correct? If that is the case, then it depends on the script or build system itself. We are using Ant where such configuration is stored in build.properties. For example:
# database credentials
db.host=localhost
db.user=root
db.pass=root
db.name=db_name
This file is working-copy specific and as such is not part of our version control; build.properties.dist, however, is. This way A's local settings don't override B's.
If the question is about something else, please, do tell :)
Can the Google Maps API's GXml parse .kmz files directly? If not, what is the best way to convert a .kmz file to .kml? The .kmz file is stored in a database and PHP code is used to retrieve it.
I'm pretty sure that there's no way to unpack zipped data with Javascript, which you'd have to do before passing the data to GXml.parse.
GGeoXml can handle KMZ files. It does it by passing the URL to a Google server which unzips the data and parses it there, then returns the individual overlay objects to the Javascript client.
Since you're reading the data with PHP, you might consider unzipping it in your PHP script, and serving the unzipped data to the client. You may need to do some work to your PHP configuration in order to enable the PHP Zip File functions.
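For instance (a rough sketch, assuming $kmzData already holds the raw KMZ bytes pulled from your database; a KMZ is just a zip archive whose main document is usually called doc.kml):

    <?php
    // Write the KMZ bytes to a temp file so ZipArchive can open them.
    $tmp = tempnam(sys_get_temp_dir(), 'kmz');
    file_put_contents($tmp, $kmzData);

    $kml = false;
    $zip = new ZipArchive();
    if ($zip->open($tmp) === true) {
        // Pull out the main KML document.
        $kml = $zip->getFromName('doc.kml');
        $zip->close();
    }
    unlink($tmp);

    if ($kml === false) {
        header('HTTP/1.1 500 Internal Server Error');
        exit('Could not extract KML from KMZ.');
    }

    // Serve the unzipped KML so the client-side code can parse it.
    header('Content-Type: application/vnd.google-earth.kml+xml');
    echo $kml;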