How should I access a WebDAV file server with PHP4?

I've looked online for ways to do this, and I've found two PHP methods for accessing WebDAV:
http://freshmeat.net/projects/class_webdav_client/
This is less than ideal, because it doesn't support WebDAV at a sub-path of the server; it cannot access, say, http://my-dav-server/configuration, only http://my-dav-server
http://php-webdav.pureftpd.org/project/php-webdav
This requires me to compile a new PHP module, which might be necessary, but is a bit of a pain. Plus, it's not clear from the docs how to do simple things like report errors or which versions of PHP it supports.
Basically, I want a WebDAV API - doesn't matter how complex, really - that can get/put files with HTTP BASIC authentication. I don't need anything more complex than that. I'm backing this with a Subversion autoversioning DAV server, but I can foresee using it in other ways too, so I don't want to lock myself into Subversion by using an SVN-specific API.

If you are just looking for GET and PUT, just use Curl! That, or any other decent HTTP library.
It's actually quite simple that way.
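For instance, a minimal sketch with PHP's curl functions (the DAV URL, credentials and payload are placeholders, not from the question):

<?php
// PUT a file onto a WebDAV sub-path with HTTP Basic authentication.
$ch = curl_init('http://my-dav-server/configuration/settings.txt');
curl_setopt($ch, CURLOPT_USERPWD, 'user:secret');
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, 'PUT');
curl_setopt($ch, CURLOPT_POSTFIELDS, 'file contents here');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$result = curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE); // e.g. 201 Created
curl_close($ch);

// GET it back the same way.
$ch = curl_init('http://my-dav-server/configuration/settings.txt');
curl_setopt($ch, CURLOPT_USERPWD, 'user:secret');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$contents = curl_exec($ch);
curl_close($ch);

Sub-paths like /configuration are no problem here, since curl just takes whatever URL you give it.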

Related

What are the best practices for PHP deployment from WinXP to WinXP?

Where I've just started working, they use a Remote Desktop Connection to transfer files and manage the server, but that seems really insecure to me, and a good way to cause errors and take down the Apache/PHP/MySQL stack.
I was proposing FTP to transfer files more easily (and more securely than the current approach), but then I started reading about PHP deployment. It seems pretty easy on Linux, but on Windows I haven't found out which is the best way to do it.
So far I think Git on the server, with developers committing to it, is my best shot, but what about database deployment?
Phing/Jenkins/Capistrano seem overly complex, but I'll try them if you think they're a good fit.
The approach I am using is database migration scripts. They look like this:
db-update-001.sql
db-update-002.sql
I have a script which executes them sequentially and creates an *.ok file for each one that succeeds. The *.sql files contain "alter" statements and are stored in Git. The .ok files are not stored, so when you distribute changes, you only need to run the scripts without .ok files.
I use this file: https://github.com/atk4/atk4/blob/master/tools/update.sh
but since you are in a Microsoft environment, you might need to do something different.
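The same idea is easy to sketch in PHP itself, which may travel better on Windows (the mysql invocation and credentials below are assumptions of mine, not what update.sh does):

<?php
// Run db-update-*.sql in order, marking each success with an .ok file.
foreach (glob('db-update-*.sql') as $script) {
    if (file_exists("$script.ok")) {
        continue; // this migration already ran on this server
    }
    // Feed the file to the mysql client; credentials are placeholders.
    system('mysql -u appuser -psecret appdb < ' . escapeshellarg($script), $exit);
    if ($exit !== 0) {
        die("$script failed; stopping so later scripts don't run out of order\n");
    }
    touch("$script.ok"); // not stored in Git, per the scheme above
}

glob() returns the names sorted, so 001 always runs before 002.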
While MSRDP is not the most secure protocol around, it's a very long way ahead of FTP.
FTP is intrinsically insecure - it sends passwords as clear text. It's also a PITA to manage across a stateful firewall even where you can ensure consistent PASV behaviour.
However, you do need a method for transferring files which can be scripted/automated.
I'd go back and have a long, hard look at the available deployment tools. I can't comment on how well the other products compare with Phing, having only used the latter; mostly I've used stuff developed in-house.
Since you really should be using a version control system anyway, I'd recommend considering it as your file delivery mechanism too.

Two ways to make Python-based webpages?

I wanted to try out Python to create webpages instead of using PHP. However, I found that you need either mod_python or mod_wsgi installed in Apache to make it play with Python. And if you use plain (I'm not sure "pure" is the right word) Python code, without any web framework like Django, a simple page looks different under mod_python than under mod_wsgi.
How come? The more I looked into Python, the harder it seemed to use for making webpages compared to PHP. Is there some good starting point for learning Python web development?
Sorry if my question is vague. I simply want some guidance to start out with Python web development.
Yes, making a webpage with Python without using a web framework is harder than it is in PHP. This is by design: it gives you a great deal more control over how your page interacts with the server, allowing you to build sites that scale well, among other benefits. WSGI is the modern way to interact with a server, but as you observed, it exposes many of the nuts and bolts that PHP hides from the user.
If you're looking for a php-like experience for python, you might look at web.py or Flask. They are pretty minimalistic as far as frameworks go, and take care of interacting with the server but otherwise stay out of your way.
That said, you really should consider Django or another similar framework - they provide some really great benefits that help you get what would otherwise be painfully complex sites going quickly. They solve a slightly different problem and provide different solutions from the common PHP frameworks, so you should consider them even if you don't like frameworks in PHP.
If you want to do things in an even more php-like fashion, you could use CGI. It's definitely not a recommended solution, and won't teach you best practices moving forward, but it can get you started...
Really though, consider a framework. It's how most development in Python for the web is done, and you'll learn more useful skills if you develop using one.
mod_wsgi is better, because it's based on the WSGI specification, which defines the interface between web applications (or frameworks) and web servers. A WSGI app at its simplest is nothing more than a function that sends some HTTP headers via a callback and returns a string in response to information about an HTTP request. And since WSGI is implemented by many web servers, you aren't tied to Apache.
The closest you can get to pure frameworkless web development in Python is to write the WSGI app directly. This will really help you see the things that a framework like Django will obscure.
To make things easier, you might consider using Werkzeug, which is a utility library for WSGI. It has many components that are framework-like, but you can choose which ones you want and which ones you don't. For example, it has a really neat system for parsing and dispatching URLs. Werkzeug also has a simple command-line WSGI server which is probably better for development than Apache.
I'm replying with some advice, as someone who was in a very similar situation to yours just a few months ago.
So you're using Apache to host your website. That's cool. To make Python play nice with Apache, you're going to want to use mod_wsgi, for the reasons others have stated: separation of concerns makes it better than CGI, and mod_python is no longer supported.
However, your impression that foregoing a framework will bring you closer to programming in "pure" python is a little bit off the mark. I shared the same opinion, and experimented with both Django and using only mod_wsgi. Let me share what I found.
Mod_wsgi is an implementation of the WSGI standard found in PEP 333. The distinction between the implementation and the standard is important. First, because it means that WSGI-compliant applications will work across implementations. More importantly, it reveals something about what WSGI is meant to do: WSGI is intended as a standard for writing frameworks. From the PEP:
simplicity of implementation for a framework author is not the same thing as ease of use for a web application author
and
The goal of WSGI is to facilitate easy interconnection of existing servers and applications or frameworks, not to create a new web framework.
I'm not saying that you shouldn't do something with WSGI, but you should expect to be writing a framework more than an application. If you're interested in writing a simple framework, this tutorial is where I started.
However, if you're just looking to make a website, look into one of the frameworks that others have suggested. You'll still be writing Python code, but the authors have worked hard to make the code you write more closely connected to producing websites than to producing frameworks. I've personally used Django, and once it was up and running, it was rather painless to churn out interesting applications. Also, their documentation is very good, and they have a good tutorial here. That being said, Django is very fully featured, and if you're looking for something a little more minimalistic, I've heard good things about Flask, but there are lots of other options as well.
You can use ordinary CGI, which is really simple. Create a Python program that looks something like this:
#!/usr/bin/env python
import sys
# CGI output is the headers, a blank line, then the body.
sys.stdout.write("Content-type: text/html\r\n\r\n")
print("Hello <em>world</em>!")
Make this file executable (chmod +x) and put it in a directory you've configured for CGI files in your web server.
You will also find the standard Python cgi module very helpful.
If your goal is to make your Python program web-friendly, then the answer is CherryPy. It is a very flexible and simple framework that lets you expose your Python objects to the web. Check it out; it has a nice built-in web server, so you don't need Apache/mod_wsgi, etc.

What are nice use cases for cURL in PHP?

It's evident that the cURL functions are very widely used. But why is that? Is it really only because the extension is usually enabled by default?
While I can certainly relate to not introducing 3rd party libraries over builtins (DOMDocument vs phpQuery), using curl appears somewhat odd to me. There are heaps of HTTP libraries like Zend_Http or PEAR Http_Request. And despite my disdain for needless object-oriented interfaces, the pull-parameter-procedural API of curl strikes me as less legible in comparison.
There is of course a reason for that. But I'm wondering if most PHP developers realize what else libcurl can actually be used for, and that it's not just an HTTP library.
Do you have examples or actual code which utilizes cURL for <any other things> it was made for?
Or, if you just use it for HTTP, what are the reasons? Why are real PHP HTTP libraries seemingly avoided nowadays?
I think this is related to why people use the mysql functions instead of mysqli (the more object-oriented interface), or go a step further and use a data abstraction layer or PDO.
HTTP_Request2 says that there is a cURL adapter available to wrap around PHP's cURL functions.
Personally, I haven't been that impressed with a lot of the PEAR extensions I've tried (and I feel less confident about PEAR libraries sitting in alpha that haven't been updated in a long time), whereas the HTTP_Request2 library does look quite nice.
I for one would have used cURL without thinking of looking at a possible PEAR library to use. So thanks for raising my awareness.
The libraries you mentioned aren't available by default, and from my experience in PHP, I prefer to use as few of them as possible; they widen the attack surface, decrease reliability, and are more exposed to future modification/deprecation than PHP itself.
Then there's the sockets functionality, which I've used a few times, though I prefer to rely on a higher-level approach whenever possible.
What have I used CURL for?
As some may know, I'm currently working on a PHP framework. The communication core extension (appropriately called "connect") uses cURL as its base.
I've used it widely, from extracting favicons from websites (together with parser utilities and such) to standard API calls over HTTP, as well as an FTP layer for when PHP's FTP is disabled (through stream wrappers) - and we all know native PHP FTP ain't that reliable.
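As one concrete non-HTTP illustration, here is a minimal sketch of that FTP fallback (host, credentials and path are invented):

<?php
// The same curl_* API speaks FTP when PHP's native FTP extension is unavailable.
$ch = curl_init('ftp://ftp.example.com/pub/readme.txt');
curl_setopt($ch, CURLOPT_USERPWD, 'user:secret');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$data = curl_exec($ch);
if ($data === false) {
    echo 'FTP transfer failed: ' . curl_error($ch);
}
curl_close($ch);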
Functional reasons as mentioned in the comments:
It's very old, [widely used and] well tested code, works reliably
is usually enabled by default
allows very fine-grained control over the details of the request (see the sketch after this list).
This might need expanding: by the nature of its common-denominator protocol API, cURL might provide features that plain HTTP libraries in PHP can't...
Historic reasons:
curl used to be the only thing that could handle cookies, POST, file uploads...
A lot of curl use probably comes from tutorials that pre-date PHP 5.
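To illustrate the fine-grained control point above, a hedged sketch of the per-request knobs curl exposes (the URL and values are arbitrary; every option shown is a standard CURLOPT_* constant):

<?php
$ch = curl_init('http://example.com/api');
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 2);       // connect timeout, separate from...
curl_setopt($ch, CURLOPT_TIMEOUT, 10);             // ...the overall transfer timeout
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);    // follow redirects...
curl_setopt($ch, CURLOPT_MAXREDIRS, 3);            // ...but only this many
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Accept: application/json'));
curl_setopt($ch, CURLOPT_INTERFACE, '192.0.2.10'); // choose the outgoing interface
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$body = curl_exec($ch);
curl_close($ch);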

PHP: an example where allow_url_include is a good idea?

I just noticed a PHP config parameter called allow_url_include, which allows you to include a PHP file hosted elsewhere as you would a local one. This seems like a bad idea, but "why is this bad" is too easy a question.
So, my question: when would this actually be a good option? When would it actually be the best solution to some problem?
Contrary to the other responders here, I'm going to go with "No". I can't think of any situation where this would be a good idea.
Some quick responses to the other ideas:
Licensing: would be very easy to circumvent
Single library for multiple servers: I'm sorry, but this is a very dumb solution to something that should be solved by syncing files from, for example, a
source control system
packaging/distribution system
build system
or a remote filesystem (NFS was mentioned)
Remote library from Google: nobody benefits from slowly loading a non-cached remote PHP library. This is not (asynchronous) JavaScript
I think I covered all of them..
Now..
Your question was about 'including a file hosted elsewhere', which I think you should never attempt. However, there are uses for allow_url_include. This setting covers more than just http://. It also covers user-defined protocol handlers, and I believe even phar://. For these there are quite a few valid uses.
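For instance, a minimal sketch of a user-defined wrapper whose "files" can only be include()d when allow_url_include is On (the wrapper name and its contents are invented for illustration):

<?php
// A toy wrapper that serves PHP source from memory.
class InMemoryCode
{
    private $pos = 0;
    private $code = "<?php echo 'loaded via custom wrapper';";

    public function stream_open($path, $mode, $options, &$opened_path)
    {
        return true;
    }

    public function stream_read($count)
    {
        $chunk = substr($this->code, $this->pos, $count);
        $this->pos += strlen($chunk);
        return $chunk;
    }

    public function stream_eof()
    {
        return $this->pos >= strlen($this->code);
    }

    public function stream_stat()
    {
        return array();
    }
}

// Wrappers flagged as URL protocols fall under allow_url_include.
stream_wrapper_register('mem', 'InMemoryCode', STREAM_IS_URL);
include 'mem://anything'; // refused unless allow_url_include=On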
The only things I can think of are:
for a remote library, for example the Google APIs.
Alternatively, if you are something like Facebook, with devs in different locations, and the devs use includes from different stage servers (DB, dev, staging, production).
Once again during dev: including a third-party program that is in heavy beta transition, so you always get the most recent build without having to fetch it yourself (for example, a remote TinyMCE beta that you are building against, which will be finished before you reach production).
However, if the remote server goes down, it kills the app, so for most people it's not a good idea for production usage.
Here is one example that I can think of.
At my organization, my division is responsible for both the intranet and the internet site. Because we are using two different servers, and in our case two different subdomains, I could see a case for having a single library used by both servers. This would allow both servers to use the same class. It wouldn't be a security problem, because you have complete control over both servers, and it would be better than trying to maintain two versions of the same class.
Since you have control over the servers, and because an external-facing server and an internal server require separation (because of the firewall), this would be a better solution than trying to keep a copy of the same class in two locations.
Hmm...
[insert barrel scraping noise here]
...you could use this as a means of licensing software - in that the license key, etc. could be stored on the remote system (managed by the seller). By doing this, the seller would retain control of all the systems attempting to access the key.
However, as you say, the list of reasons this is a horrifying idea outweighs any positives in my mind.

PHP Difference between Curl and HttpRequest

I need to do raw POST (PUT a $var) requests to a server and accept the results from that page as a string. I also need to add custom HTTP header information (like x-example-info: 2342342).
I have two ways of doing it
Curl (http://us.php.net/manual/en/book.curl.php)
PHP HTTP using the HTTPRequest (http://us.php.net/manual/en/book.http.php)
What are the differences between the two? Which is leaner? Faster? Both seem pretty much the same to me...
Curl is bundled with PHP; HTTPRequest is a separate PECL extension.
As such, it's much more likely that CURL will be installed on your target platform, which is pretty much the deciding factor for most projects. I'd only consider using HTTPRequest if you plan to only ever install your software on servers you personally have the ability to install PECL extensions on; if your clients will be doing their own installations, installing PECL extensions is usually out of the question.
This page seems to suggest that HTTPRequest uses CURL under the hood anyway. Sounds like it might offer a slightly more elegant interface to curl_multi_*(), though.
HTTPRequest (the PECL extension) is built on libcurl.
http://us.php.net/manual/en/http.requirements.php
The HTTPRequest is really just an easier/more syntactically friendly way to perform the same task.
As Frank Farmer mentioned, you're more likely to have a target platform with curl already installed and could have difficulty getting the PECL library installed by the hosting provider.
The HTTPRequest is "kind of" a wrapper for curl. This two quotes from the manual should give you a clue:
It provides powerful request functionality, if built with CURL support. Parallel requests are available for PHP 5 and greater.
The extension must be built with libcurl support to enable request functionality (--with-http-curl-requests). A library version equal to or greater than v7.12.3 is required.
That said (and I've never used this extension myself), it looks like if you want your code to be more object-oriented, you can go for this one. It might be a bit slower, though that's nothing compared with the external call you are going to make, so I wouldn't let performance drive the choice. I'd give more weight to the fact that curl is built in, while the other you have to add yourself, which is inconvenient and reduces portability if you want to host your app in a shared environment you don't control.
For the needs that you explained in your question, I would definitely go for curl.
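For the specific task in the question, a hedged curl sketch (the endpoint URL is a placeholder; the header is the one from the question):

<?php
// Raw PUT of $var with a custom header; the response comes back as a string.
$var = 'raw request body';
$ch = curl_init('http://api.example.com/endpoint');
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, 'PUT');
curl_setopt($ch, CURLOPT_POSTFIELDS, $var);
curl_setopt($ch, CURLOPT_HTTPHEADER, array('x-example-info: 2342342'));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
$response = curl_exec($ch);
curl_close($ch);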
