I have the following workflow, which I'd like to automate with a short PHP script:
1. Download a gzipped file (containing a db dump) from a specific URL (potentially FTP).
2. Decompress the file to plain text.
3. Import the text file into a local PostgreSQL database with psql (using the command line).
Now I have 2 questions:
What is the best way to pass the gunzipped file to pg_query?
I get an error when PHP reaches this line:
COPY rf (datum, id_emailu_op, recency, frequency) FROM stdin;
2011-08-29 8484 3 1
Can the stdin be a problem?
Thank you all!
A pg_dump file is meant to be imported via psql. You can load the file contents, and even decompress them with PHP, then open a pipe to psql and write the data out to that process (assuming you are on a Unix machine). When psql is executed this way, as far as it is concerned the data your PHP script writes is coming in via stdin.
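A minimal sketch of that approach, where the dump URL, database name, and psql user are placeholders you would replace with your own, and error handling is omitted:

<?php
// Fetch the gzipped dump and decompress it in memory (gzdecode() needs PHP >= 5.4).
$gz  = file_get_contents('http://example.com/dump.sql.gz'); // placeholder URL
$sql = gzdecode($gz);

// Open a pipe to psql; everything written here arrives on psql's stdin,
// so the COPY ... FROM stdin; blocks in the dump are handled as intended.
$psql = popen('psql -U postgres mydb', 'w'); // placeholder user and database
fwrite($psql, $sql);
pclose($psql);
?>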
Related
I'm trying to find a solution for uploading data to a remote MySQL database using a Python daemon.
My base script uses a simple "INSERT INTO ..." query, but at the top of the script the credentials to connect to the database appear in the clear:
conn = MySQLdb.connect(host="192.168.1.123", user="root", passwd="Pa$$w0rd", db="mydb")
I do not want anyone who reads the Python script to be able to access the database directly.
I need to use Python scripts on the clients.
The server with the MySQL db is in the cloud.
The clients are Raspberry Pis.
On the server I can use PHP.
There are several ways:
1. Use a my.cnf file
A MySQL cnf file is a config file storing MySQL credentials; if the server or client executing the script has one, you can use it like:
db=_mysql.connect(host="outhouse",db="thangs",read_default_file="~/.my.cnf")
Credentials are no longer in your Python script, but they are still stored in the clear in the cnf file, and you must have this file in the same place everywhere you want to use your script.
source: http://mysql-python.sourceforge.net/MySQLdb.html
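For reference, such a file is plain INI-style text; a minimal ~/.my.cnf using the host and credentials from the question (shown only as an illustration) might look like:

[client]
host = 192.168.1.123
user = root
password = Pa$$w0rd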
2. Command-line arguments
You can parse the command-line arguments to get the credentials from them, like:
import sys
import MySQLdb

user = sys.argv[1]
password = sys.argv[2]
conn = MySQLdb.connect(host="192.168.1.123", user=user, passwd=password, db="mydb")
Then execute your script with:
python myscript 'root' 'pa$$w0rd'
With this method the credentials can't be found in any config file, but you have to execute the script with arguments. If it's a daemon that can be OK, but if you want to run it from cron (for example), you will have to write the credentials into the crontab, so it is not so safe.
3. Environment variables
Another way is to use environment variables:
import os
import MySQLdb

conn = MySQLdb.connect(host="192.168.1.123", user=os.environ['MYSQL_USER'], passwd=os.environ['MYSQL_PASSWORD'], db="mydb")
But you will have to set these variables somewhere: in ~/.bash_profile, in /etc/profile, or on the command line. So if somebody gets access to a user that can execute the script, they can read the password.
4. Encoding
Encoding seems like a good way to hide the password, but in fact you cannot really hide it from someone who has access to the right user on the right server.
An encoded string without a salt is easy to decode; some encoding methods are easy to spot and can be decoded by anyone.
Using a salt makes the work harder, but if somebody can access your code, it will be easy to locate the salt phrase (no matter whether the salt is stored in an environment variable, in a file, or directly in the code).
Using .pyc files can be an idea too, but first, it's not recommended, and second, anybody can read the content by creating a Python script that imports the .pyc file and prints what is stored in it.
Encoding credentials is still a good thing, but an encoded string can always be decoded: if your code can decode it, somebody with access to your code can too. A simple print of the password added to your code, and regardless of the method used, the password will be accessible.
So you have to secure your Python code, but also the Unix users and groups, and the MySQL config.
Scenario:
I have built a PHP framework that uses a postgresql database. The framework comes shipped with a .sql file which is a dump of the default tables and data that the framework requires.
I want to be able to run the sql file from the client (PHP), rather than the command line, in order to import the data. This is because I have come across some server setups where accessing the command line is not always a possibility, and/or running certain commands isn't possible (pg_restore may not be accessible to the PHP user for example).
I have tried simply splitting up the .sql file and running it as a query using the pgsql PHP extension; however, because the dump file uses COPY commands to create the data, this doesn't seem to work. It seems that because COPY is used, the .sql file expects to be imported using the pg_restore command (unless I am missing something?).
Question:
So the question is, how can I restore the .sql dump, or create the .sql dump in a way that it can be restored via the client (PHP) rather than the command line?
For example:
<?php pg_query(file_get_contents($sqlFile)); ?>
Rather than:
$ pg_restore -d dbname filename
Example of the error:
I am using pgAdmin III to generate the .sql dump, using the "plain" setting. In the .sql file, the data that will be inserted into a table looks like this:
COPY core_classes_models_api (id, label, class, namespace, description, "extensionName", "readAccess") FROM stdin;
1 data Data \\Core\\Components\\Apis\\Data The data api Core 310
\.
If I then run the above sql within a pgAdmin III query window, I get the following error:
ERROR: syntax error at or near "1"
LINE 708: 1 data Data \\Core\\Components\\Apis\\Data The data api Core...
This was a bit tricky to find, but after some investigation, it appears that pg_dump's "plain" format (which generates a plain-text SQL file) generates COPY commands rather than INSERT commands by default.
Looking at the documentation for pg_dump, I found the --inserts option. Setting this option makes the dump create INSERT commands where it would normally create COPY commands.
The specification does state:
This will make restoration very slow; it is mainly useful for making dumps that can be loaded into non-PostgreSQL databases. However, since this option generates a separate command for each row, an error in reloading a row causes only that row to be lost rather than the entire table contents.
This works for my purposes however, and hopefully will help others with the same problem!
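As an illustration, generating a plain-format dump with --inserts and then replaying it from PHP could look roughly like this (the database name, file name, and connection resource are placeholders, and error handling is minimal):

$ pg_dump --inserts -f dump.sql dbname

<?php
// dump.sql now contains INSERT statements only, so pg_query() can execute it directly.
$sql = file_get_contents('dump.sql');
if (pg_query($connection, $sql) === false) {
    echo pg_last_error($connection);
}
?>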
What is the best way to generate an Excel file and inform the user afterwards?
I am using PHPExcel to generate an Excel file from an MSSQL Server onto a web server and allow a user to download it using a link. The problem is that each time we try to execute the PHP script it throws a FastCGI timeout error. The script needs to read 2000 to 5000 rows of data.
We tried to execute it via the command prompt using exec() and the shell. It successfully generates the file on the server, but we don't have a way of informing the user after the script has completed.
exec() should return the result of running the external program; can't you use that? You can move the generated file to a directory that is reachable by the user and just give them the URL to the file.
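A rough sketch of that suggestion (the generator script name and download directory are only placeholders):

<?php
// Run the long-running generator from the command line and capture its exit status.
exec('php generate_excel.php', $output, $exitCode);

if ($exitCode === 0) {
    // The generator is assumed to have written report.xlsx into a web-reachable directory.
    echo '<a href="/downloads/report.xlsx">Download the report</a>';
} else {
    echo 'Generating the report failed.';
}
?>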
I have been making a text writer where you can create a file and then also store it in a folder on the web server, somewhat like a cloud computing system.
But I have a problem: how do I save the file when the menu is clicked? I am trying it with PHP's exec() function to run a script, but I can't work out how to invoke the script on the click event. Please help; your input will be highly appreciated.
Also, is there an option to open a saved file in the text writer dynamically on click?
Thanks!
Why can't you just save it with the fwrite() function?
I mean, it's the usual method of saving files: fopen() it for writing, fwrite() the data, and finally fclose() the file. Either use this general approach, or tell us more about your specific needs.
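A minimal sketch of that approach (the form field names and target folder are assumptions, not part of the original question):

<?php
// Assume the editor posts the text as $_POST['content'] and a file name as $_POST['filename'].
$path = 'documents/' . basename($_POST['filename']); // basename() keeps the file inside the folder

$fp = fopen($path, 'w'); // open for writing
fwrite($fp, $_POST['content']);
fclose($fp);

// Later, to open a saved file back into the editor:
$text = file_get_contents($path);
?>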
If you still need to run Unix/Linux commands from PHP, one option to debug what's happening is to log in on the command line as the user PHP is running under, e.g. www-data or nobody. You can find out which user that is by running this in your script (see the backtick execution operator):
print `whoami`;
A client has a Windows-based in-house server, on which they edit the contents of a CMS. The data are synchronized nightly with the live web server. This is a workaround for a slow Internet connection.
There are two things to be synchronized: new files (already sorted) and a MySQL database. To do this, I am writing a script that exports the database into a dump file using mysqldump, and uploads the dump.
The upload process is done using a 3rd party tool named ScriptFTP, an FTP automation tool.
I then need to run a PHP-based import script on the target server. Depending on this script's return value, the ScriptFTP operation goes on, and some directories are renamed.
I need an external tool for this, as ScriptFTP only supports FTP calls. I was thinking about the Windows version of wget.
Within ScriptFTP, I can execute any batch or exe file, but I can only check the errorlevel resulting from the call, not the stdout output. This means that I need to return errorlevel 1 if the PHP import operation goes wrong, and errorlevel 0 if it goes well. Additionally, obviously, I need to return a positive errorlevel if the connection to the import script could not be made at all.
I have total control over the importing PHP script, and can decide what it does on error: Output an error message, return a header, whatever.
How would you go about running wget (or any other tool to kick off the server side import) and returning a certain error level depending on what the PHP script returns?
My best bet right now is building a batch file that executes the wget command, stores the result in a file, and returns errorlevel 0 or 1 depending on the file's contents. But I don't really know how to match a file's contents using batch programming.
You can do the following in PowerShell:
$a = wget --quiet -O - www.google.com        # capture the page the URL returns, from wget's stdout
$rc = $a.CompareTo("Your magic string")      # 0 if the output equals the expected string, non-zero otherwise
exit $rc                                     # the exit code becomes the errorlevel the caller sees
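On the PHP side, the import script could then print an agreed-upon marker only when the import succeeds, so the caller has something unambiguous to compare against. A hypothetical sketch (the marker text and runImport() are placeholders, not part of the original setup):

<?php
// import.php - called remotely by wget from the ScriptFTP/PowerShell side
try {
    runImport();                  // placeholder for the actual import logic
    echo 'Your magic string';     // the exact string the caller compares against
} catch (Exception $e) {
    header('HTTP/1.1 500 Internal Server Error');
    echo 'IMPORT FAILED: ' . $e->getMessage();
}
?>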