Asterisk in string causes file listing? - php

Oh my good lord. I am trying to update some code left behind by the previous programmer. The issue I'm running into is that when I try to write an SQL statement, the asterisk gets replaced by a file listing.
The previous guy liked to run PHP scripts like shell scripts. (If all you have is a hammer...) So, this is running from the command line.
// Here's Jason's additions. We add split commissions to the splits table and merge them with the records coming out of GP.
$jemsql = "select * from splits where salesPerson = '$salesCode'";
print "JemSQL: ".$jemsql."\n\n";
$sp_data = $mysql->query($jemsql);
$headData = array_merge($headData, $sp_data);
Basic PHP, right? Well, I keep getting an SQL error and the logging reveals...
select MJS_new.report.php OLE Spreadsheet Spreadsheet_Excel_Writer-0.9.2.tgz (long file listing...)
from exception where salesPerson = 'JD'
I went back and ran his original script, and the problem is OEM. He has the same problem in the same place.
Help me out here: why is the asterisk returning a file listing? How can the asterisk get overwritten? Is there something I can do to stop this behaviour? More code available if needed, but I am stumped. How does this even happen?
EDIT:
I have minimized the file, keeping only what is absolutely needed:
#!/usr/local/bin/php
<?
date_default_timezone_set("America/New_York");
$db_host = 'datasrv';
$db_user = 'XXXX';
$db_pass = 'XXXX';
$db_name = 'financials';
mysql_connect($GLOBALS["db_host"], $GLOBALS["db_user"], $GLOBALS["db_pass"]) or die('Could not connect: ' . mysql_error());
mysql_select_db($GLOBALS["db_name"]) or die('Could not select database '.$db_name);
// Here's Jason's additions. We add split commissions to the splits table and merge them with the records coming out of GP.
$jemsql = "select * from splits where salesPerson = '".$salesCode."'";
print "JemSQL: ".$jemsql."\n\n";
$sp_data = mysql_query($jemsql)
?>
When I run this with the shell script that is normally used to kick it off, here's the output:
JemSQL: select MJS_new.report.php OLE Spreadsheet Spreadsheet_Excel_Writer-0.9.2.tgz (long file listing) from splits where salesPerson = ''
If I just run it from the command line:
/usr/local/bin/php ./min_report.php JD test.pdf
JemSQL: select * from splits where salesPerson = ''
Here's the shell script that calls it:
for salesPerson in ${sales[*]}; do
    fileTarget="$mountTarget$salesPerson/$fileName";
    dirTarget="$mountTarget$salesPerson/";
    out=`/usr/local/bin/php ./min_report.php $salesPerson $fileTarget`
    echo $out;
    out=`cp reports/$salesPerson.xls $dirTarget`;
    echo $out;
    echo $salesPerson;
    cnt=`expr $cnt + 1`
done
So, at least I'm clued in to the problem... And I'm quite curious to know why he chose that "echo" method of running a command in the bash shell.
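For the record, the asterisk never changes inside PHP. The backticks capture the script's output (including the printed SQL) into `$out`, and the unquoted `echo $out` lets the calling shell glob-expand the bare `*` against the current directory. A minimal sketch of the effect, using made-up file names in a scratch directory:

```shell
# Reproduce the effect in an empty scratch directory (hypothetical file names).
workdir=$(mktemp -d)
cd "$workdir"
touch notes.txt report.php

out='select * from splits'
unquoted=$(echo $out)     # the shell expands the bare * against the directory
quoted=$(echo "$out")     # quoting suppresses glob expansion
echo "unquoted: $unquoted"
echo "quoted:   $quoted"
```

Quoting the variable (`echo "$out"`) keeps the literal asterisk; the mangling happens at display time in the calling shell, not inside the PHP script.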

Related

PHP MySQL system call creates empty file, has no return val, and produces no error message

I've burned more hours than I care to admit trying to figure this out :) A file is created successfully; however, it is either 0 bytes or contains the mysql man page. I can't seem to get this to execute correctly or get an error message out.
The query executes correctly in MySQLAdmin. If I replace the mysql call with ls, it pipes a listing of the files into my output file. The host, user, and password strings are all correct and well formatted, which leads me to believe there is something wrong with my syntax?
$command = 'mysql --host=localhost --user='.DBASEUSER.' --password='.DBASEBPSWD.' --database='.DBASE.' -execute=SELECT Real_acct.Mail_Addr_1 from Real_acct > ../outputfiles/output.txt';
$returnVal = system($command, $returnVal);
You only have one dash (-) before your execute parameter. You would probably also need double quotes around your query statement:
$command = 'mysql --host=localhost --user='.DBASEUSER
.' --password='.DBASEBPSWD
.' --database='.DBASE
.' --execute="SELECT Real_acct.Mail_Addr_1 from Real_acct" > ../outputfiles/output.txt';
$returnVal = system($command, $returnVal);
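The double quotes matter because without them the shell word-splits the SELECT into separate arguments, and mysql only sees the first word of the query. A quick way to see the splitting without touching a database (the query string is just illustrative):

```shell
query='SELECT Real_acct.Mail_Addr_1 from Real_acct'

set -- --execute=$query    # unquoted: the query splits into several arguments
unquoted_argc=$#
set -- --execute="$query"  # quoted: the whole query arrives as one argument
quoted_argc=$#

echo "unquoted: $unquoted_argc arguments, quoted: $quoted_argc argument"
```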

UPDATE statement takes too long

Well, I have this problem that I hoped someone could help me with.
So what's it about?
I have developed a PHP script that imports XML files from a folder into a database.
The XML file looks like this: XML file
Basically the script stores information from the XML file into 5 tables, and that works correctly.
But the problem is that my file does not contain the ID of the players in the PLAYER object, so after I import everything into the database I have to run this query:
$sql = "SELECT igraci.ID, utakmice.Player_ID, utakmice.ID AS broj FROM igraci LEFT JOIN utakmice ON (igraci.Team_ID = utakmice.Team_ID) AND (igraci.Surname = utakmice.Lastname) AND (igraci.Name = utakmice.Firstname);";
$tabela = mysql_query($sql);
$row = mysql_fetch_assoc($tabela);
$totalrow = mysql_num_rows($tabela);
$i=0;
do {
$i++;
$sql = "UPDATE utakmice SET Player_ID=" . $row['ID'] . " WHERE ID = " . $row['broj'] . "";
echo $sql."<br>";
mysql_query($sql);
} while ($row = mysql_fetch_assoc($tabela));
The SELECT statement executes really fast and I have no problem with that, but the UPDATE commands make the script time out.
I have tried adding indexes on the fields used in this query, but that didn't help, and as soon as I have more than 2200 rows the script fails.
The script executed fine on an older version of PHP, but last month we had to upgrade to 5.3 and that's when the problem started.
Is there any way I can speed this UPDATE up?
PS: XML file is from FIBA live Cms system.
Is it the PHP script timing out?
Do you need to do this as a SELECT followed by a potentially large number of UPDATEs?
Could you not just use a single UPDATE statement, something like this:
UPDATE utakmice
INNER JOIN igraci
ON (igraci.Team_ID = utakmice.Team_ID)
AND (igraci.Surname = utakmice.Lastname)
AND (igraci.Name = utakmice.Firstname)
SET utakmice.Player_ID = igraci.ID
Add an INDEX on utakmice.ID to speed up the WHERE part.
If you're not sure about performance run:
EXPLAIN SELECT * FROM utakmice WHERE ID = [x]
See if it's using an index or doing a full table scan (an index is good, a table scan is slow).
Apart from setting an index on ID, you can try batching your updates as explained here.
You need to prepare a query by concatenating CASE WHEN clauses as necessary. It's worth a try, but I haven't done any performance tests to see if it would give you a huge boost here.
In the end you'd get something like:
UPDATE utakmice SET title = CASE
WHEN id = <your_first_broj_from_result> THEN <your_first_id_from_result>
WHEN id = <your_second_broj_from_result> THEN <your_second_id_from_result>
...
END
WHERE id IN (<your_first_broj_from_result>, <your_second_broj_from_result>,...)

GET php data to a commandline prompt

A PHP application on the server is saving a certain document with a sequential number into a MySQL database. How do I obtain that sequential number at the command prompt that initiates the local document scanner?
ex:
c:\myscan ask_for_current_seq_nmbr.pdf
myscan is something written in C that takes care of the PC stuff. Only the name of the file is unknown.
Some code (from the query PHP file)
$query = "SELECT last_seq FROM seq_table WHERE cat = 1";
$result = mysql_query($query, $link) or die('ERROR: '. mysql_error().'<br />ON LINE: '.__LINE__);
while($row = mysql_fetch_assoc($result)) {
echo $row['last_seq'];
}
!!! NOTE !!!
I am fetching a page from a remote server. ex. www.site.com/query.php?q=SELECT * FROM...
And that selection results in the last used sequential number which I would like to use in my command prompt.
!! UPDATE !!
We HAVE to go through a PHP file on the remote server to avoid having to use Remote MySQL, which has to be enabled on an IP basis.
You can call processes on the command line with the various functions from PHP's exec family.
If you're having problems building the actual command string, you can do so with:
$cmd = sprintf('c:\myscan %d.pdf', $sequential_number);
Since you write that the script is already storing $sequential_number in the db, I assume you already have it.
In case the database generates the number, it is probably the primary key; see mysql_insert_id for obtaining the id.
Okay, judging by the backslash and the C:\ I'm guessing you're using Windows.
You are going to have to combine the following:
http://dev.mysql.com/doc/refman/5.5/en/mysql.html
How to store the result of a command expression in a variable using bat scripts?
and then to access the content of the variable you created use the %VARIABLE_NAME% syntax.
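On a Unix-like shell, the equivalent pattern is to capture the page's output into a variable and hand it to the scanner (on Windows, `for /f` in a batch file plays the same role). This sketch fakes the HTTP fetch with an echo; in reality you would use something like curl against your query.php URL, and both the URL and the number here are placeholders:

```shell
# Stand-in for: seq=$(curl -s "http://www.site.com/query.php")
seq=$(echo '12345')

# Build the file name to hand to the (hypothetical) scanner binary.
scan_target="$seq.pdf"
echo "would run: myscan $scan_target"
```

The key point is the same on both platforms: the query page must echo only the number, so the captured variable can be used directly as part of the file name.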
You should have a flag in your mysql table, like is_processed, with value 0 or 1.
When the scan starts, it runs the query:
SELECT * FROM TABLE where is_processed = 0 order by sec_number
After processing you should run query:
UPDATE TABLE set is_processed = 1 where sec_number = 'sec_processed_number';

Read DB, write to flat file, times out between 15,000 & 35,000 records

I am reading a MySQL database with a query that returns 187,000 records, and I am writing the data to a flat file. It just stops, without any error, somewhere between 15,000 and 35,000 records.
I thought maybe the database connection was timing out, so I started pulling 10,000 records at a time with LIMIT, but it still happens. So I imagine it is either the browser or PHP that is timing out. Here is my code. If there is a better way of doing this, I am totally open to hearing it.
$sql->Query($stype.$search);
$checkrows = $sql->rows;
if ($checkrows > 0){
$fh = fopen($listname, 'w');
for ($i = 0; $i < $sql->rows; $i++) {
$sql->Fetch($i);
$email .= $sql->data[1]."\n";
fwrite($fh, $email);
$cot++;
echo $cot."-".$sql->data[1]."<br>";
}
fclose($fh);
}
If it is PHP timing out, try setting set_time_limit(0).
This is not in PHP per se, but you can easily export your database to a flat-file format using MySQL only, with a query like this one:
SELECT email FROM database.table
INTO OUTFILE '/path/to/file/foo.txt'
LINES TERMINATED BY '\n';
However, this will write to the same server running MySQL. You can also do a similar thing by using the MySQL command-line client to write locally:
mysql -u user --password=mypass \
-e "SELECT email FROM database.table" \
-B --skip-column-names > foo.txt
Add set_time_limit(600); to the top! Your script times out.
My mistake was this line of code: echo $email .= $sql->data[1]."\n";
It should have been: echo $email = $sql->data[1]."\n";
With .=, $email kept growing, so the entire accumulated string was written to the file again on every iteration.

How to speed up processing a huge text file?

I have an 800 MB text file with 18,990,870 lines in it (each line is a record), from which I need to pick out certain records and, if there is a match, write them into a database.
It is taking an age to work through them, so I wondered if there was a way to do it any quicker?
My PHP is reading a line at a time as follows:
$fp2 = fopen('download/pricing20100714/application_price','r');
if (!$fp2) {echo 'ERROR: Unable to open file.'; exit;}
while (!feof($fp2)) {
$line = stream_get_line($fp2,128,$eoldelimiter); //use 2048 if very long lines
if ($line[0] === '#') continue; //Skip lines that start with #
$field = explode ($delimiter, $line);
list($export_date, $application_id, $retail_price, $currency_code, $storefront_id ) = explode($delimiter, $line);
if ($currency_code == 'USD' and $storefront_id == '143441'){
// does application_id exist?
$application_id = mysql_real_escape_string($application_id);
$query = "SELECT * FROM jos_mt_links WHERE link_id='$application_id';";
$res = mysql_query($query);
if (mysql_num_rows($res) > 0 ) {
echo $application_id . "application id has price of " . $retail_price . "with currency of " . $currency_code. "\n";
} // end if exists in SQL
} else
{
// no, application_id doesn't exist
} // end check for currency and storefront
} // end while statement
fclose($fp2);
At a guess, the performance issue is that it issues a query for every application_id that matches USD and your storefront.
If space and IO aren't an issue, you might just blindly write all 19M records into a new staging DB table, add indices and then do the matching with a filter?
Don't try to reinvent the wheel; it's been done. Use a database to search through the file's content. You can load that file into a staging table in your database and query your data using indexes for fast access if they add value. Most if not all databases have import/loading tools to get a file into the database relatively fast.
19M rows in the DB will slow it down if the DB is not designed properly. You can still use text files if they are partitioned properly: recreating multiple smaller files, split on certain parameters and stored in properly sorted order, might work.
Anyway, PHP is not the best language for file IO and processing; it is much slower than Java for this task, while plain old C would be one of the fastest for the job. PHP should be restricted to generating dynamic Web output, while core processing should be in Java/C. Ideally it would be a Java/C service that generates the data, with PHP using that feed to generate the HTML output.
You are parsing the input line twice by doing two explodes in a row. I would start by removing the first line:
$field = explode ($delimiter, $line);
list($export_date, ...., $storefront_id ) = explode($delimiter, $line);
Also, if you are only using the query to test for a match based on your condition, don't use SELECT * use something like this:
"SELECT 1 FROM jos_mt_links WHERE link_id='$application_id';"
You could also, as Brandon Horsley suggested, buffer a set of application_id values in an array and modify your select statement to use the IN clause thereby reducing the number of queries you are performing.
Have you tried profiling the code to see where it's spending most of its time? That should always be your first step when trying to diagnose performance problems.
Preprocess with sed and/or awk ?
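Expanding on the sed/awk idea: since only USD/143441 rows matter, a one-line awk pre-filter can shrink the 800 MB file before PHP ever opens it. This sketch assumes the same field order as the PHP explode (export_date, application_id, retail_price, currency_code, storefront_id) and a tab delimiter, which may differ in the real feed:

```shell
# Build a tiny sample file in a scratch directory with the assumed layout.
workdir=$(mktemp -d)
printf '2010-07-14\t111\t0.99\tUSD\t143441\n2010-07-14\t222\t1.99\tEUR\t143441\n' > "$workdir/application_price"

# Keep only rows whose currency is USD and storefront is 143441 (fields 4 and 5).
awk -F'\t' '$4 == "USD" && $5 == "143441"' "$workdir/application_price" > "$workdir/filtered"
cat "$workdir/filtered"
```

The PHP loop then only has to read the (much smaller) filtered file, and the per-line currency/storefront check disappears entirely.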
Databases are built and designed to cope with large amounts of data, PHP isn't. You need to re-evaluate how you are storing the data.
I would dump all the records into a database, then delete the records you don't need. Once you have done that, you can copy those records wherever you want.
As others have mentioned, the expense is likely in your database query. It might be faster to load a batch of records from the file (instead of one at a time) and perform one query to check multiple records.
For example, load 1000 records that match the USD currency and storefront at a time into an array and execute a query like:
'select link_id from jos_mt_links where link_id in (' . implode(',', $application_id_array) . ')'
This will return a list of the records that are in the database. Alternatively, you could change the SQL to not in to get a list of those records that are not in the database.
