Super-lightweight database engine to distribute as part of a tiny PHP script - php

I am looking for a super-lightweight open-source database engine (or a library that mimics one) to be packaged as part of a tiny PHP script distributed to people without sudo access. Basic CRUD only; no need for complicated features like string search, etc.
I found txtSQL (uses flat files, which I believe is the way to go), but I'm hesitant to use it given that it was last updated in March 2005.
Suggestions anyone?

SQLite gives you a platform-independent file format, is heavily regression-tested and widely used, and is available in PHP via the SQLite3 class.
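A minimal sketch of basic CRUD with the bundled SQLite3 class; the `:memory:` database and the table name are just for illustration (pass a file path such as `app.db` to persist to disk):

```php
<?php
// Minimal CRUD sketch with the SQLite3 class; ':memory:' keeps the
// example self-contained -- use a file path like 'app.db' to persist.
$db = new SQLite3(':memory:');
$db->exec('CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)');

// Create
$stmt = $db->prepare('INSERT INTO items (name) VALUES (:name)');
$stmt->bindValue(':name', 'first', SQLITE3_TEXT);
$stmt->execute();

// Read
$row = $db->querySingle('SELECT name FROM items WHERE id = 1');

// Update
$db->exec("UPDATE items SET name = 'renamed' WHERE id = 1");

// Delete
$db->exec('DELETE FROM items WHERE id = 1');
$db->close();
```

Since SQLite ships with PHP 5, nothing extra has to be installed on the users' hosts for this to run.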

SQLite is about as light as you can get, and it stores everything in a single ordinary file.

Related

Reading perl flat file db with php

I have an old flat-file Perl DB that's part of an eCommerce site I want to migrate to a new PHP application.
Is it possible to read these "table" files with PHP? They have no file extension and don't seem to be just CSVs or similar.
If I understand your question correctly, you have the kind of Perl database that's accessed with a so-called tied hash.
This uses technology generically known as dbm. The most recent implementation is gdbm, a GNU version, described at http://www.gnu.org.ua/software/gdbm/. It's likely (but not 100% certain) that this is the version used by the Perl infrastructure of your old app.
There's a PHP API with functions like dba_open() that supports the dbm variants: http://www.php.net/manual/en/ref.dba.php. You should be able to handle that file of yours with it.
It's worth observing that this dba extension wasn't loaded in my PHP installation until I enabled it explicitly. You may have to try various dbm implementations until you find the one that matches. The three I know about are the original UNIX dbm, ndbm, and gdbm.
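A hedged sketch of reading such a file with the dba extension; the filename `data.db` and the `gdbm` handler are assumptions, and the code guards against the extension being absent (as noted above, it is often not loaded by default) -- check dba_handlers() for the variants your build supports:

```php
<?php
// Sketch: dump all keys and values from a dbm-family file.
// 'data.db' and the 'gdbm' handler are assumptions; if opening fails,
// try the other handlers reported by dba_handlers().
$keys = [];
if (extension_loaded('dba') && is_file('data.db')) {
    $handle = dba_open('data.db', 'r', 'gdbm');
    if ($handle !== false) {
        for ($k = dba_firstkey($handle); $k !== false; $k = dba_nextkey($handle)) {
            $keys[$k] = dba_fetch($k, $handle);
        }
        dba_close($handle);
    }
}
```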

OpenOffice in server mode with PHP on Windows

I am working on a project which requires me to generate documents as docx and then convert to doc and pdf.
This project is written in PHP (using Zend Framework) and running on IIS on Windows (client requirements - definitely not my choice!).
Windows Server
IIS 7.5
PHP 5.3
OpenOffice 3.2
I am researching the ways in which I can carry out the document conversion (including the LiveDocx service) and am currently looking into using OpenOffice running as a service to convert the documents.
I have a PHP script which works -- it is similar to the code in this post: How do I convert RTF to PDF from my PHP web page using OpenOffice? -- but I wanted to know how well this will scale. The PHP script is basically a PHP version of the PyOD converter, using PHP's COM functions. On this page (http://code.google.com/p/jodconverter/wiki/GettingStarted) it specifically says that the PyOD script is not intended to work with multiple concurrent connections. I would therefore assume that the PHP script is equally unsuitable.
Having read around, it seems that the OpenOffice process which is running will only support one connection at a time. Is this definitely correct? If so then am I right in thinking that it is simply not a viable solution? I would be expecting high usage for the product so concurrent conversions are a must. Does anyone have any experience with this in a production environment?
Finally, does anyone have any other recommendations for carrying out the conversions? If not, I will go back to using the LiveDocx service. My only real gripes with it were the speed and some inaccuracies in the conversions.
Thank you in advance for your help.
You can probably scale OpenOffice to do what you require, but having worked directly with the OpenOffice UNO API in the past, you might find you have a lot of work to do. Trying to use a single OpenOffice process in a multithreaded fashion only led me to grief. You can, however, spawn several OpenOffice processes and single-thread each one. Whether that is scalable enough depends on your performance criteria...
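The several-processes idea can be sketched like this; the soffice path, port numbers, and profile directory are assumptions for illustration, not a tested setup:

```php
<?php
// Sketch: build launch commands for a small pool of headless
// OpenOffice instances, one listening port each. Work is then
// serialized per process, which sidesteps the one-connection limit.
function buildPoolCommands(string $soffice, array $ports): array
{
    $commands = [];
    foreach ($ports as $port) {
        $accept  = "socket,host=127.0.0.1,port={$port};urp;";
        // Each instance needs its own user profile or they will clash.
        $profile = "-env:UserInstallation=file:///C:/oo_profiles/{$port}";
        $commands[$port] = "{$soffice} -headless -accept=\"{$accept}\" {$profile}";
    }
    return $commands;
}

$commands = buildPoolCommands('soffice.exe', [8100, 8101, 8102]);
// Launch each command with proc_open()/popen(); the PHP side then
// picks an idle port per request and talks to that single instance
// (via COM or a UNO bridge), never sharing one process across requests.
```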
JODReports and Docmosis sit on top of OpenOffice, so it might be worth trying one of these systems to see if you could potentially scale to what you require before investing any development effort in the underlying technologies yourself. You might also look at LibreOffice, which has recently evolved a little further than OpenOffice.
Hope that helps.

Performance Issues with Zipped Archives in PHP

First Some Background
I'm planning out the architecture for a new PHP web application and trying to make it as easy as possible to install. As such, I don't care what web server the end user is running so long as they have access to PHP (setting my requirement at PHP5).
But the app will need some kind of database support. Rather than working with MySQL, I decided to go with an embedded solution. A few friends recommended SQLite - and I might still go that direction - but I'm hesitant since it needs additional modules in PHP to work.
Remember, the aim is ease of installation ... most lay users won't know what PHP modules their server has or even how to find their php.ini file, let alone how to enable additional tools.
My Current Objective
So my current leaning is to go with a filesystem-based data store. The "database" would be a folder, each "table" would be a specific subfolder, and each "row" would be a file within that subfolder. For example:
/public_html
    /application
        /database
            /table
                1.data
                2.data
            /table2
                1.data
                2.data
There would be other files in the database as well to define schema requirements, relationships, etc. But this is the basic structure I'm leaning towards.
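A minimal sketch of that structure, assuming serialized PHP arrays as the row format (the base directory, table, and helper names are all illustrative):

```php
<?php
// Sketch: each row is a file at <base>/<table>/<id>.data holding a
// serialized PHP array. LOCK_EX gives crude per-row write locking.
function rowPath(string $base, string $table, int $id): string
{
    return "{$base}/{$table}/{$id}.data";
}

function writeRow(string $base, string $table, int $id, array $row): void
{
    @mkdir("{$base}/{$table}", 0777, true); // ensure the "table" folder exists
    file_put_contents(rowPath($base, $table, $id), serialize($row), LOCK_EX);
}

function readRow(string $base, string $table, int $id): ?array
{
    $path = rowPath($base, $table, $id);
    return is_file($path) ? unserialize(file_get_contents($path)) : null;
}

$base = sys_get_temp_dir() . '/fsdb';
writeRow($base, 'table', 1, ['name' => 'example']);
$row = readRow($base, 'table', 1);
```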
I've been pretty happy with the way Microsoft built their Office Open XML file formats (.docx/.xlsx/etc.). Each file is really a ZIP archive of a set of XML files that define the document.
It's clean, easy to parse, and easy to understand.
I'd like to actually set up my directory structure so that /database is really a ZIP archive that resides on the server - a single, portable file.
But as the data store grows in size, won't this begin to affect performance on the server? Will PHP need to read the entire archive in to memory to extract it and read its composite files?
What alternatives could I use to implement this kind of file structure but still make it as portable as possible?
SQLite has been enabled by default since PHP 5, so almost all PHP 5 users should have it.
I think there will be tons of problems with the zip approach. For example, adding a file to a relatively large zip archive is very time-consuming, and I expect horrible concurrency and locking issues.
Reading zip files requires a PHP extension anyway, unless you go with a pure PHP solution. The downside is that most pure-PHP solutions WILL want to read the whole zip into memory, and will also be far slower than something written in C and compiled, like the zip extension in PHP.
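For what it's worth, a sketch with ext/zip (assuming that extension is available): it can fetch a single member without extracting the whole archive, but adding or rewriting members still rewrites the archive, which is what makes a zip slow as a live data store. The file and member names are made up:

```php
<?php
// Sketch: write one member into a zip, then read just that member back.
$file = sys_get_temp_dir() . '/database.zip';

$zip = new ZipArchive();
$zip->open($file, ZipArchive::CREATE | ZipArchive::OVERWRITE);
$zip->addFromString('table/1.data', 'row one'); // rewrites the archive
$zip->close();

$zip->open($file);
$contents = $zip->getFromName('table/1.data'); // random access by name
$zip->close();
```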
I'd choose another approach, or make SQLite/MySQL a requirement. If you use PDO for PHP, then you can allow the user to choose SQLite or MySQL and your code is no different as far as issuing queries. I think 99%+ of webhosts out there support MySQL anyway.
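A sketch of the PDO point: the query code is identical for SQLite and MySQL, only the DSN differs (the DSNs and credentials here are illustrative):

```php
<?php
// Sketch: the same PDO code runs against SQLite or MySQL; only the
// DSN changes. ':memory:' keeps this example self-contained.
$useSqlite = true;
$pdo = $useSqlite
    ? new PDO('sqlite::memory:')
    : new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$pdo->exec('CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)');
$stmt = $pdo->prepare('INSERT INTO items (name) VALUES (?)');
$stmt->execute(['portable']);
$name = $pdo->query('SELECT name FROM items')->fetchColumn();
```

Letting the installer flip one flag (or config value) between the two drivers keeps the easy-install goal intact while still using a real database.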
Using a real database will also help your performance. It's worth loading the extra modules (and most PHP installations have at least the mysql module, and probably sqlite as well), because those modules are written in C, run much faster than PHP code, and have been optimized for speed. Using SQLite will help keep your web app portable, if you're willing to deal with SQLite's quirks.
Zip archives are great for data exchange. They aren't great for fast access, though, and they're awful for rewriting content. Both of these are extremely important for a database used by a web application.
Your proposed solution also has some specific performance issues -- the list of files in a zip archive is internally stored as a "flat" list, so accessing a file by name takes O(n) time relative to the size of the archive.

Translate PHP site through DB or local files?

I have a PHP, database-driven website that uses a lot of Flash for user interaction.
I need to make it multilingual, with 20+ languages.
The site is quite large and has a lot of users coming to it every day.
Another developer I work with says we should store the translations in local files, e.g. /lang/english.php, /lang/german.php, etc.
I was thinking that since the database is on the same dedicated server there should not be a slowdown. Which way do you think will be faster?
I don't know if it's an option, but you could also use gettext().
That way your translations are stored in local files (faster than a database), and you have the advantage that there are programs like Poedit (which takes some getting used to...) that you or a translator can use to generate the translation files, so it's a bit easier to maintain than PHP files.
Local files are a LOT faster than DB content (although you can cache the DB output locally, in files or even memcache or APC). They're probably not as easy to translate, but they will help with basic speed of implementation too. You should take a look at:
http://framework.zend.com/manual/en/zend.translate.html
You can use just this part of the framework, and it will give you a HUGE boost; it supports DB-based translation or local files (a lot of adapters).
UPDATE: Thanks Corbin, you are right; it's better to have the direct link.
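A minimal sketch of the local-file approach discussed above, assuming each /lang/*.php file simply returns a key => translation array (the file layout, keys, and helper names are made up):

```php
<?php
// Sketch: lang/german.php would contain
//   <?php return ['greeting' => 'Hallo'];
// and is loaded once per request via include.
function loadTranslations(string $lang): array
{
    $file = __DIR__ . "/lang/{$lang}.php";
    return is_file($file) ? include $file : [];
}

function t(array $translations, string $key): string
{
    // Fall back to the key itself when a translation is missing.
    return $translations[$key] ?? $key;
}

// Usage with an inline array standing in for an included lang file:
$hello = t(['greeting' => 'Hallo'], 'greeting');
```

Because each include is a plain PHP file, an opcode cache (APC etc.) keeps the translations in memory for free, which is a large part of why this beats a DB round trip.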

Convert a class to an extension

I have a PHP class I want to convert to a PHP extension. I checked some tutorials (tuxradar's writing extensions, php.net's extending php, and zend's extension writing) and it's a bit complicated.
I found the article "How to write PHP extensions" (ed note: site is defunct) and I wanted to know if it is possible to use this to make it grab a PHP class from a certain path (say /home/website1/public_html/api/class.php), execute it and return the class instance.
This way it will be usable in other websites that are hosted on the same server – each can simply call the function and it will obtain its own instance.
Is that possible?
The question, as I understand it now, is: the user has a PHP class that they would like to share with multiple people, but they do not want to share the source code.
There are many solutions to this; they generally involve turning the PHP code into some kind of byte code and using a PHP extension to run that byte code. I've never used any of these solutions, but I'm aware of the following:
phc is an open source compiler for PHP
Zend Guard
HipHop for PHP - I'm unsure about this, but Facebook recently released it so it might be worth a look.
I'm sure there are others. Just Google for PHP Compiler, or PHP Accelerator.
In one sentence: I don't believe so, I think its a lot more work than that.
No, there is no tool that can do that.
Anyway, what you want can be easily accomplished with auto_prepend_file. Just make that ini directive point to a PHP file containing the class definition, and the class will then be available to all the applications.
If you don't want the users to be able to see the source, you can use one of the several Zend extensions that allow you to pre-compile the file and use it in that form.
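A sketch of the auto_prepend_file approach, using the path from the question:

```ini
; php.ini (or a per-directory setting, e.g. .user.ini or an Apache
; php_value directive), applied to every site that should see the class:
auto_prepend_file = /home/website1/public_html/api/class.php
```

Every request then behaves as if that file had been include()d at the top of the executed script, so each site can instantiate the class without any explicit include.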
You can expose underlying C library functions to PHP space by writing PHP extensions. However, I think in your case you don't need to write one.
I am aware that this is an old question (from 2012), but the answer has changed: there is now a tool that can do this. Jim Thunderbird's PHP-to-C extension toolset can take anything from a simple class in one file up to a complicated multi-file, multi-level namespaced framework and convert it to a C extension that can then be installed into your PHP server.
In many use cases this is unnecessary, since ordinary PHP code will work just as well, but in some cases significant performance improvements can be had. The information page shows that an ordinary class (deliberately designed to take a long time) took 16.802139997482 seconds as plain vanilla PHP and 3.9628620147705 seconds as a PHP extension built with the tool.
As an added advantage, the tool provides a further feature: the ability to combine PHP code (to be converted to C) with native C code within the same extension, which can produce even greater performance gains. The same example took only 0.14397192001343 seconds when much of the intensive code was moved to a bubble-sort C implementation simply called from within the PHP code.
As a side note, for the end developers using the code, the extension is functionally very similar to having the files manually included in the PHP file being developed, except that nothing has to be explicitly included, since the class is loaded through the PHP extension mechanism.
(Disclaimer: I am not affiliated with this developer, but I am glad to have come across the tool, as it has so far worked for converting some of my intensive classes into PHP extensions without my needing to know C.)
