Get PHP data to a command line prompt

A PHP application on the server saves a certain document with a sequential number into a MySQL database. How can I get that sequential number into the command line prompt that initiates the local document scanner?
ex:
c:\myscan ask_for_current_seq_nmbr.pdf
myscan is something written in C that takes care of the PC stuff. Only the name of the file is unknown.
Some code (from the query PHP file):
$query = "SELECT last_seq FROM seq_table WHERE cat = 1";
$result = mysql_query($query, $link) or die('ERROR: '. mysql_error().'<br />ON LINE: '.__LINE__);
while ($row = mysql_fetch_assoc($result)) {
    echo $row['last_seq'];
}
!!! NOTE !!!
I am fetching a page from a remote server, e.g. www.site.com/query.php?q=SELECT * FROM...
That selection returns the last used sequential number, which I would like to use in my command prompt.
!! UPDATE !!
We HAVE to go through a PHP file on the remote server to avoid having to use remote MySQL access, which has to be enabled on an IP basis.

You can call command line processes from PHP with the various functions of the exec family (see the docs).
If you're having problems building the actual command string, you can do it with sprintf:
$cmd = sprintf('c:\myscan %d.pdf', $sequential_number);
As you write that the script is already storing it in the database, I assume you have $sequential_number available.
If the database generates the number, it is probably the primary key; see mysql_insert_id (docs) for obtaining that id.
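Given the update that the number has to come through a PHP file on the remote server, a minimal sketch might look like the following; the URL, its cat parameter, and the assumption that query.php echoes nothing but the number are placeholders rather than confirmed details:
<?php
// Minimal sketch, run on the machine that has myscan installed.
// Assumes query.php echoes only the last sequence number (placeholder URL/parameter).
$seq = trim(file_get_contents('http://www.site.com/query.php?cat=1'));
if (ctype_digit($seq)) {
    // Build the command string and hand it to the local scanner tool.
    $cmd = sprintf('c:\myscan %d.pdf', $seq);
    exec($cmd, $output, $exitCode);
} else {
    die('Unexpected response from query.php');
}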

Okay, judging by the backslash and the C:\ I am guessing you're using Windows.
You are going to have to combine the following:
http://dev.mysql.com/doc/refman/5.5/en/mysql.html
How to store the result of a command expression in a variable using bat scripts?
and then access the content of the variable you created using the %VARIABLE_NAME% syntax.

You should have a flag in your MySQL table, like is_processed, with a value of 0 or 1.
When the scan starts, it runs this query:
SELECT * FROM TABLE where is_processed = 0 order by sec_number
After processing, you should run this query:
UPDATE TABLE set is_processed = 1 where sec_number = 'sec_processed_number';

ODBC array fetch of VARBINARY column causes server timeout

I have an app that fetches VARBINARY(max) data from a SQL Server database. On my local environment the app connects via the SQL Server driver. The connection string of odbc_connect contains:
DRIVER={SQL Server}
I am fetching the VARBINARY data like this:
// Hexadecimal data of attachment
$query = 'SELECT * FROM attachments WHERE LOC_BLOB_ID = ' . $blob_id;
$attach_result = odbc_exec($connection, $query);
odbc_binmode($attach_result, ODBC_BINMODE_CONVERT);
odbc_longreadlen ($attach_result, 262144);
$attach_row = odbc_fetch_array($attach_result);
$hex_data = $attach_row['attachment_value'];
$binary = hex2bin($hex_data);
It works well. Now I need to run this app on a server where my only option is to use ODBC Driver 17 for SQL Server. The connection string contains:
DRIVER={ODBC Driver 17 for SQL Server}
And it doesn't work. It fails on line number 6 of the preview above (on odbc_fetch_array). I've tried commenting out the odbc_binmode and odbc_longreadlen lines (I assumed this driver might handle that data natively), but no luck, same result: a Service Unavailable timeout error.
Is there a different approach to this with ODBC Driver 17?
Edit: I found out it hangs on ODBC_BINMODE_CONVERT. If I change it to ODBC_BINMODE_RETURN, it runs within a few seconds - however, the output is wrong. ODBC_BINMODE_CONVERT is indeed what I need, but it doesn't process all the data in time (the timeout is 30 seconds), which is strange, because the VARBINARY field in the database is only 65K characters long, and it runs extremely fast on my local environment.
Edit 2: I've tried converting the incomplete binary data fetched from the database to hexadecimal and then to PNG, and it displays half of the image. So I am positive it is fetching the correct data; it just takes incredibly long to fetch that column, resulting in timeouts in almost every case.
OK, finally figured it out. What ended up working for me was using the ODBC_BINMODE_RETURN flag instead of ODBC_BINMODE_CONVERT, and NOT using the hex2bin() conversion at the end.
The code in my original question worked fine with {SQL Server}, and the following code works with {ODBC Driver 17 for SQL Server}:
$query = 'SELECT * FROM attachments WHERE LOC_BLOB_ID = ' . $blob_id;
$attach_result = odbc_exec($connection, $query);
odbc_binmode($attach_result, ODBC_BINMODE_RETURN);
odbc_longreadlen ($attach_result, 262144);
$attach_row = odbc_fetch_array($attach_result);
$binary = $attach_row['attachment_value'];
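If the goal is then to hand the attachment to the browser, a short follow-up along these lines might work; the image/png content type is an assumption based on the PNG mentioned in the edit above:
// Hypothetical follow-up: stream the fetched attachment to the client.
// The content type is assumed from the PNG mentioned above; adjust as needed.
header('Content-Type: image/png');
header('Content-Length: ' . strlen($binary));
echo $binary;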

php sqlite - copy table to another database

I want to copy a table from another db file, but I fail and I can't figure out why. This is my code:
$db = new SQLite3($_SERVER['DOCUMENT_ROOT']."/db/098765.db");
$sql = "ATTACH DATABASE 'admin.db' AS admin ;
INSERT INTO 'table-1' SELECT * FROM 'admin.table-1';";
$db->query($sql);
I've read all the questions on this topic on this site, but no answer helped me.
Giving the full path to ATTACH DATABASE doesn't work. Creating the table before inserting data also doesn't work.
The sqlite3 command line tool has a handy command called .dump that makes this task trivial:
sqlite3 admin.db '.dump "table-1"' | sqlite3 098765.db
This will create the table, all associated indexes and of course it will copy all the data.
Edit: For a more general solution, create a shell script (let's call it copy-table.sh) as follows:
#!/bin/bash
src_db="$1"
dst_db="$2"
table="$3"
sqlite3 "$src_db" ".dump \"$table\"" | sqlite3 "$dst_db"
Then you can execute the script as follows:
./copy-table.sh 'admin.db' '098765.db' 'table-1'
Obviously, you can execute the script any way you want, e.g. from cron or from PHP.
Properly quote database object identifiers (table/column names etc.) in your INSERT statement. Use double quotes instead of single ones, which are for string literals. Better yet, don't use dashes or other restricted characters in object names if possible (stick with alphanumerics and underscores).
Use exec() instead of query()
$dbPath = $_SERVER['DOCUMENT_ROOT'];
$adminDbPath = $dbPath; // Or something else
$db = new SQLite3("$dbPath/db/098765.db");
$db->exec("ATTACH DATABASE '$adminDbPath/admin.db' AS admin");
$db->exec('INSERT INTO "table-1" SELECT * FROM admin."table-1"');
(Note the double quotes around the table names, the admin. prefix on the source table, and the full path passed to ATTACH DATABASE.)
You can get an exact copy of a table by performing the following set of SQL statements
(in the context of a connection to the destination database):
attach '<source-db-full-name>' as sourceDb;
select sql from 'sqlite_master' where type = 'table' and name = '<name-of-table>';
-- Execute the result of the previous statement.
-- It will create an empty table with a schema
-- identical to the schema of the source table.
insert into '<name-of-table>' select * from sourceDb.[<name-of-table>];
detach sourceDb;
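If you want to drive those same steps from PHP, a rough sketch with the SQLite3 class could look like this; the paths, the sourceDb alias, and the table name are placeholders taken from the question:
<?php
// Sketch only: copy a table between SQLite files using ATTACH + sqlite_master.
// Paths and the table name are placeholders based on the question.
$table = 'table-1';
$db = new SQLite3($_SERVER['DOCUMENT_ROOT'] . '/db/098765.db');

// 1. Attach the source database.
$db->exec("ATTACH DATABASE '" . $_SERVER['DOCUMENT_ROOT'] . "/db/admin.db' AS sourceDb");

// 2. Read the CREATE TABLE statement from the source schema...
$createSql = $db->querySingle(
    "SELECT sql FROM sourceDb.sqlite_master WHERE type = 'table' AND name = '$table'"
);

// 3. ...and run it to create an empty table with an identical schema.
$db->exec($createSql);

// 4. Copy the rows, then detach.
$db->exec("INSERT INTO \"$table\" SELECT * FROM sourceDb.\"$table\"");
$db->exec('DETACH sourceDb');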

UPDATE statement takes too long

Well I have this problem that I hoped someone could help me with:
So what's it about?
I have developed a PHP script that imports XML files from a folder into a database.
The XML file looks like this: XML file
Basically the script stores information from the XML file into 5 tables, and that works correctly.
But the problem is that my file does not contain the ID information of players in the PLAYER object, so after I import everything into the database I have to run this query:
$sql = "SELECT igraci.ID, utakmice.Player_ID, utakmice.ID AS broj FROM igraci LEFT JOIN utakmice ON (igraci.Team_ID = utakmice.Team_ID) AND (igraci.Surname = utakmice.Lastname) AND (igraci.Name = utakmice.Firstname);";
$tabela = mysql_query($sql);
$row = mysql_fetch_assoc($tabela);
$totalrow = mysql_num_rows($tabela);
$i = 0;
do {
    $i++;
    $sql = "UPDATE utakmice SET Player_ID=" . $row['ID'] . " WHERE ID = " . $row['broj'];
    echo $sql . "<br>";
    mysql_query($sql);
} while ($row = mysql_fetch_assoc($tabela));
The SELECT statement executes really fast and I have no problem with that, but the UPDATE commands are making the script time out.
I have tried adding indexes on the fields used in this query, but that didn't help, and as soon as I have more than 2200 rows the script fails.
The script was executing OK on an older version of PHP, but last month we had to upgrade to 5.3 and that's where the problem started.
Is there any way that I can speed this UPDATE up?
PS: The XML file is from the FIBA Live CMS system.
Is it the PHP script timing out?
Do you need to do this as a SELECT followed by potentially a large number of updates?
Could you not just use a single UPDATE statement, something like this:
UPDATE utakmice
INNER JOIN igraci
ON (igraci.Team_ID = utakmice.Team_ID)
AND (igraci.Surname = utakmice.Lastname)
AND (igraci.Name = utakmice.Firstname)
SET utakmice.Player_ID = igraci.ID
Add an INDEX on utakmice.ID to speed up the WHERE part.
If you're not sure about performance, run:
EXPLAIN SELECT * FROM utakmice WHERE ID = [x]
See if it's using an index or doing a full table scan (an index is good, a table scan is slow).
Apart from setting an index on ID, you can try batching your updates as explained here.
You need to prepare a query by concatenating CASE WHENs where necessary. It's worth a try, but I haven't done any performance tests to see if it could give you a huge boost here.
In the end you'd get something like:
UPDATE utakmice SET Player_ID = CASE
WHEN id = <your_first_broj_from_result> THEN <your_first_id_from_result>
WHEN id = <your_second_broj_from_result> THEN <your_second_id_from_result>
...
END
WHERE id IN (<your_first_broj_from_result>, <your_second_broj_from_result>,...)
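A rough sketch of how that batched statement could be built from the SELECT result, keeping the mysql_* style of the question (and assuming the result set has not been partially fetched yet):
// Sketch: build one batched UPDATE instead of issuing one UPDATE per row.
// Column and variable names follow the question's code.
$cases = array();
$ids   = array();
while ($row = mysql_fetch_assoc($tabela)) {
    $broj    = (int) $row['broj'];
    $cases[] = 'WHEN ' . $broj . ' THEN ' . (int) $row['ID'];
    $ids[]   = $broj;
}
if ($ids) {
    $sql = 'UPDATE utakmice SET Player_ID = CASE ID '
         . implode(' ', $cases)
         . ' END WHERE ID IN (' . implode(',', $ids) . ')';
    mysql_query($sql);
}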

Using PHP & odbc_result, why am I losing the 4th decimal place?

I'm hoping this is an easy question.
If I run an ODBC connection via Excel, I get exactly what I would expect to see from the database. When I port the query over to XAMPP for testing, I cannot get the query to display the 4th decimal place in the results.
Here is my SQL query defined in Excel:
SELECT MC_BOM_DETAIL.finished_item, MC_BOM_DETAIL.item_num, MC_BOM_DETAIL.quantity, MC_BOM_DETAIL.line_num, IC_INVENTRY_MAST.um_stocking
FROM IC_INVENTRY_MAST IC_INVENTRY_MAST, MC_BOM_DETAIL MC_BOM_DETAIL
WHERE IC_INVENTRY_MAST.company = MC_BOM_DETAIL.company AND IC_INVENTRY_MAST.item_num = MC_BOM_DETAIL.item_num
ORDER BY MC_BOM_DETAIL.finished_item, MC_BOM_DETAIL.line_num
The PHP page executing the query is as follows:
//Define ODBC Connection
$mas_conn = odbc_connect("odbc_connection", "user_name", "password");
//Define Query
$query = "SELECT mbd.finished_item, mbd.item_num, mbd.line_num, mbd.quantity, icm.um_stocking
FROM IC_INVENTRY_MAST icm, MC_BOM_DETAIL mbd
WHERE icm.company = mbd.company AND icm.item_num = mbd.item_num
ORDER BY mbd.finished_item, mbd.line_num";
if ($result = odbc_exec($mas_conn, $query)) {
    while (odbc_fetch_row($result)) {
        echo odbc_result($result, 'quantity') . '<br>';
    }
}
odbc_free_result($result);
odbc_close($mas_conn);
The results are always truncated (not rounded) to the 3rd decimal place; however, I need precision to the 4th.
I looked through the php.ini file to see if that might be the cause, but came up short.
Does anybody have any suggestions?
Thank you for your time.
Solution:
The issue in my case was with the data definitions inside the source data. This is a bit tricky, as the database is read-only and the ODBC driver is provided by a 3rd party (ProvideX). After speaking with a tech at the software manufacturer, I was able to change the data definition, which was set to decimal(8,3). Why Excel could read the 4th decimal place and the PHP ODBC connection could not is still unanswered; my only thought is that it is because Excel runs locally.
Check your Windows localization settings:
Control Panel -> Region and Language -> Formats -> Additional Settings -> No. of digits after decimal...

Scripting a MySQL query in Unix using daemon in PHP

I'm trying to make an "at" job run at a given time. For testing purposes I'm using $time, but this will be a datetime that I get from a separate MySQL query. On the command line I go like this:
echo "mysql -e 'UPDATE admin SET row1=row2 WHERE id=1;'" | at now
And I get "job 36 at 2010-10-28 15:05". In PHP I tried this:
exec("\"mysql -e 'UPDATE admin SET row1=row2 WHERE id=1'\" | at $time");
But the query doesn't run. Worse, I have no idea what is happening.
echo exec('whoami');
returns "daemon" though. How can I echo whatever response I'm getting from the exec command? ideally I guess it would say "job 36 at 2010-10-28 15:05" or something similar.
Also, I have a .my.cnf file in my dir that specifies the db, login and password to use, does the daemon need to have one also for these to work?
[from the answers I can tell I wasn't clear about what I am trying to do. I need to
A. Run a mySQL query to get a date/time and an id
B. Schedule an update to take place to rows that match the id at the date/time
I'd already done "A" and was using "1" for the id and "now" for the time while testing. I'll look into PDO.
I'm not that familiar with the at command, but it seems to help run commands at a certain time.
If you're trying to write a scheduled script in PHP there are better ways to do it. Just write a CLI script and use the cron to schedule it. As Svisstack notes, if you're running MySQL queries use an in-built function such as PDO rather than system commands.
If you're just running systems commands, I'm not sure why you're using PHP ;)
Are you perhaps in safe_mode? If so, your | is getting escaped automatically. Per the manual:
With safe mode enabled, the command string is escaped with escapeshellcmd(). Thus, echo y | echo x becomes echo y \| echo x.
You can get more information by using the other parameters of the exec command. Try running this and see the output:
$output = array();
$return = false;
$last_line = exec("\"mysql -e 'UPDATE admin SET row1=row2 WHERE id=1'\" | at now", $output, $return);
var_dump($last_line);
var_dump($output);
var_dump($return);
Also, it looks like when you ran it at the command line, you echoed the MySQL command, but when you put it in the PHP code, it's not doing the echo anymore. I'm not sure if that makes a difference since I'm not too familiar with at, but I thought I could offer some troubleshooting help.
$last_line = exec("echo \"mysql -e 'UPDATE admin SET row1=row2 WHERE id=1'\" | at now", $output, $return);
A missing echo in your second statement / exec? Also, I'd rather use popen / proc_open to run at $time, and fwrite the command for at to execute, after which you close the input stream and then the process. Use atq to verify whether it worked, and be aware that the current user may be disallowed from using at jobs (see /etc/at.allow or /etc/at.deny).
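A rough sketch of that popen approach, sticking to the "now" time spec from the troubleshooting example above; the scheduled command is taken unchanged from the question:
// Sketch: open `at` for writing and feed it the command on stdin,
// which avoids the nested shell quoting problems of the exec() call.
$at = popen('at now', 'w');
if ($at === false) {
    die('could not start at');
}
fwrite($at, "mysql -e 'UPDATE admin SET row1=row2 WHERE id=1'\n");
pclose($at);
// at prints "job N at ..." on stderr; use proc_open() instead of popen()
// if you need to capture that confirmation.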
