Insert icon into mydatabase - php

How can I insert an icon into my database from PHP, using an object oriented method?
table name
ser_icon is my database column name

You should mention what you have done so far to fix the issue, including any solutions/code you have tried. But here is a small help:
It's not recommended to store any kind of image in the database. Rather, store it in a directory and save the path in the db for manipulation. However, if you still want to save an image in the db, you need a field of type 'BLOB', which is used to save images/icons in the database.
You can do something like this:
// Image submitted by the form. Open it for reading (mode "r").
$fp = fopen($_FILES['file_name']['tmp_name'], "r");
if ($fp) {
    $content = fread($fp, $_FILES['file_name']['size']);
    fclose($fp);
    // Add slashes to the content so that special characters are escaped.
    // As pointed out, mysql_real_escape_string() can be used here as well. Your choice.
    $content = addslashes($content);
    // Insert into the table "your_table_name", column "ser_icon", our binary string of data ($content).
    mysql_query("INSERT INTO your_table_name (ser_icon) VALUES ('$content')");
}
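Since the question asks for an object oriented method, here is a hedged sketch of the same idea with mysqli prepared statements instead of the old mysql_* API. The sample bytes and temp file are stand-ins for the uploaded icon, and the mysqli calls (with assumed connection parameters) are left commented so the snippet stands alone:

```php
<?php
// Hedged sketch: read the file into a binary string, which is the value
// you would bind to the BLOB column. The sample bytes and temp file are
// illustrations, not the original poster's setup.
$path = tempnam(sys_get_temp_dir(), 'icon');
file_put_contents($path, "\x89PNG\r\n\x1a\n"); // 8 sample bytes (a PNG signature)

$content = file_get_contents($path); // binary-safe read, no fopen/fread needed

// With mysqli (object oriented), parameter binding replaces addslashes():
//   $db = new mysqli('localhost', 'user', 'pass', 'your_database');
//   $stmt = $db->prepare('INSERT INTO your_table_name (ser_icon) VALUES (?)');
//   $stmt->bind_param('s', $content);
//   $stmt->execute();

unlink($path);
```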

Related

how to convert file content to a byte array with PHP

I want to save (insert) an uploaded file into a database with PHP, where the type of the database field is varbinary.
I want the final content of the varbinary column (the output) to match what C# produces when the file is read into a byte array and the array is inserted into the varbinary column.
My connection to the database uses sqlsrv.
My files are just PDFs and images.
I tried this code, but my output is different from the output of C#:
$handle = @fopen($_FILES["my_file"]["tmp_name"], 'rb');
$content = file_get_contents($_FILES["my_file"]["tmp_name"]);
$content = unpack("N*", $content);
$content = implode($content);
$sql = "INSERT INTO files (file_data) VALUES (CONVERT(varbinary(MAX), ?))";
$params = array();
array_push($params, $content);
$table = sqlsrv_query($conn, $sql, $params);
$conn is the name of my connection, which works correctly.
PHP doesn't have a "byte array" data type. What it has is a string type, which is a byte array for all intents and purposes. To read the binary content of a file into a variable which is as close to a byte array as you'll ever get in PHP, do:
$content = file_get_contents($_FILES['my_file']['tmp_name']);
Yup, that's it. Nothing more to do.
I'm not particularly familiar with the sqlsrv API, but perusing its documentation, it appears that you can (need to?) flag the data as binary this way:
sqlsrv_query($conn, 'INSERT INTO files (file_data) VALUES (?)', array(
    array($content, SQLSRV_PARAM_IN, SQLSRV_PHPTYPE_STRING, SQLSRV_SQLTYPE_BINARY)
));
I propose always converting your binary data to base64.
That way it can be stored in the database easily, and it can also be transferred from anywhere to anywhere with minimal headache!
$handle = @fopen($_FILES["my_file"]["tmp_name"], 'rb');
$elephantContent = file_get_contents($_FILES["my_file"]["tmp_name"]);
$rabbitContent = base64_encode($elephantContent);
// Now ...
$sql = "INSERT INTO files (file_data) VALUES (?)";
sqlsrv_query($conn, $sql, array($rabbitContent));
The file_data field in the files table can be varchar, varbinary, blob, or text! :)
Now it can be fetched from the database and packed into an img tag directly:
<img src="data:image/jpeg;base64,PUT `file_data` CONTENTS HERE!" alt="..." />
You can store the file type and alt properties in the database and put them in the tag as well.
You can even convert it back to binary in the database (if you insist on seeing binary data in the db) (MySQL 5.6+):
SELECT FROM_BASE64(`file_data`) as elephant_content from `files` WHERE ...
... and I'm pretty sure there is an equivalent method in SQL Server.
For example read this:
https://social.technet.microsoft.com/wiki/contents/articles/36388.transact-sql-convert-varbinary-to-base64-string-and-vice-versa.aspx
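The base64 round trip described above can be sketched in plain PHP; the raw bytes here stand in for an uploaded file's contents, and the variable names follow the answer:

```php
<?php
// Encode once on the way into the database, decode (or embed) on the way out.
$elephantContent = "\xFF\xD8\xFF\xE0 fake jpeg bytes"; // stand-in for file contents
$rabbitContent   = base64_encode($elephantContent);    // safe to store as varchar/text

// Later: rebuild the original bytes, or build an inline <img> source directly.
$restored = base64_decode($rabbitContent);
$imgSrc   = 'data:image/jpeg;base64,' . $rabbitContent;
```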

PHP CSV upload script, working with new lines?

I'm trying to get a CSV imported into a MySQL database, where each new line should represent a new row in the database.
Here is what I have so far in the CSV:
1one, 1two, 1three, 1four
2one, 2two, 2three, 2four
And in the application:
$handle = fopen($_FILES['filename']['tmp_name'], "r");
$data = fgetcsv($handle, 1000, ",");
$sql = "INSERT INTO tbl (col1, col2, col3, col4) VALUES (?, ?, ?, ?)";
$q = $c->prepare($sql);
$q->execute(array($data[0], $data[1], $data[2], $data[3]));
The problem is that only the first four values are being inserted, clearly due to the lack of a loop.
I can think of two options to solve this:
1) Do some "hacky" for loop, that remembers the position of the index, and then does n+1 on each of the inserted array values.
2) Realise that fgetcsv is not the function I need, and there is something better to handle new lines!
Thanks!
while ($data = fgetcsv($handle, 1000, ",")) {
    // process each $data row
}
You may also wish to set auto_detect_line_endings to true in php.ini to avoid issues with Mac-created CSVs.
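A hedged sketch of the full loop: each fgetcsv() call reads one more line, so looping consumes the whole file. An in-memory stream stands in for the uploaded file here, and the prepared-statement execute from the question is left commented:

```php
<?php
// Stand-in for fopen($_FILES['filename']['tmp_name'], "r").
$handle = fopen('php://memory', 'r+');
fwrite($handle, "1one, 1two, 1three, 1four\n2one, 2two, 2three, 2four\n");
rewind($handle);

$rows = array();
while (($data = fgetcsv($handle, 1000, ",")) !== false) {
    $data = array_map('trim', $data); // the sample has a space after each comma
    $rows[] = $data;
    // In the real script, this is where each row is inserted:
    // $q->execute(array($data[0], $data[1], $data[2], $data[3]));
}
fclose($handle);
```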
Why would you need a script for this? You can do this in one simple query:
LOAD DATA LOCAL INFILE '/data/path/to/file.csv' INTO TABLE your_db.and_table
FIELDS TERMINATED BY ', ' /* included the space here, because there's one in your example */
LINES TERMINATED BY '\n' /* or, on a Windows box, probably '\r\n' */
(`col1`, `col2`, `col3`, `col4`);
That's all there is to it (in this case; the MySQL manual lists more options that can be specified, like OPTIONALLY ENCLOSED BY etc...).
Ok, as far as injection goes: while inserting, it is (to the best of my knowledge) impossible for this to be an issue. The data is at no point used to build a query; MySQL just parses it as varchar data and inserts it (it doesn't execute any of it). The only operation it undergoes is a type cast to int or float, if that turns out to be required.
What could happen is that the data contains query strings that could do harm when you start selecting data from your table. You might be able to set your MySQL server to escape certain characters for this session, or you could just run a str_replace('``','``',$allData); or something in your script.
Bottom line: I'm not entirely sure, but the risk of injection should be, overall, rather small.
A bit more can be found here
When it comes to temp files, since you're using $_FILES['filename']['tmp_name'], you might want to use your own temp file: file_put_contents('myLoadFile.csv',file_get_contents($_FILES['filename']['tmp_name']));, and delete that file once you're done. It could well be that it's possible to use the tempfile directly, but I haven't tried that, so I don't know (and not going to try today :-P).

Store BLOB-like data in PostgreSQL

I recently switched from MySQL to PostgreSQL. I have one problem left however.
Previously, I would store small images in the BLOB format in MySQL.
PostgreSQL doesn't know such thing as a BLOB.
I tried using the BYTEA field type instead. This actually inserts a large (hexadecimal?) string, I guess, but now I'm stuck trying to turn this string back into an actual image displayed in PHP.
Any ideas? Thanks in advance.
Here is a piece of code I use to save the image in the database:
$data = bin2hex(file_get_contents('php://input'));
if (!empty($data)) {
    $sql = "UPDATE asset SET data = X'%s' WHERE uuid = '%s'";
    $args = array($data, $asset_uuid);
}
psql (9.1.3) and php 5.3.6 are used
Bytea is a byte array. It's not a bit pattern. See section 4.2.1.5 of PostgreSQL Lexical Structure.
The correct way to enter bytea is '\x...' with hex values. So what you want is SET data = '\x%s'.
You might also want to look into prepared statements with pg_prepare.
Edit: I was able to insert a (text) file into a bytea with this:
$source = file_get_contents( 'hello.php' );
$insert = pg_prepare( $conn, '', 'insert into t (name, data) values($1,$2)' );
pg_execute( $conn, '', array( 'hello.php', $source ) );
3rd Edit: This works fine to insert the file into the database. However, the pgsql driver in PHP is quite impolite. The only way to retrieve the actual data back is using the old bytea escape mechanism, as detailed here: pg_unescape_bytea.
pg_query('SET bytea_output = "escape";');
$result = pg_query('select data from t');
while ($row = pg_fetch_row($result)) {
    echo pg_unescape_bytea($row[0]);
}
I'm sorry about how annoying this is. The PostgreSQL interface in PHP could do with a major overhaul for binary values.
To insert bytea contents with the pg_* API, the binary value should always be run through the pg_escape_bytea() function, even if it's passed to the pg_execute or pg_query_params functions.
This is because the pg_* layer doesn't "know" that a particular parameter has binary contents, and it does not implement any real support for parameter types anyway. So the text representation must be used. It can be either the escape form or the hex form; it doesn't matter to the PG server, and it's independent of the value of bytea_output, which is meaningful only for values read from the server.
Example:
$esc=pg_escape_bytea("\000\001\002");
pg_query_params('INSERT INTO some_table(some_col) VALUES($1)', array($esc));
To read bytea contents with the pg_* API, the value must be run through pg_unescape_bytea() after the fetch. Assuming the client library is not older than 9.0 (libpq.so.5.3 or higher), it can decode the contents whether it's in hex form or escape form, and it will autodetect it. Only with an older library would it be necessary to force bytea_output to escape for it to decode properly, either dynamically with SET, or statically for the whole database (ALTER DATABASE SET bytea_output=escape), or in postgresql.conf for the whole instance.
Example:
$p=pg_query("SELECT some_col FROM some_table WHERE...");
$r=pg_fetch_array($p);
$contents = pg_unescape_bytea($r[0]);
Both answers posted here gave me some thoughts, but none were 100% of the answer.
So, I will explain in this answer what I did to get it to work.
When displaying the image, I used this:
header('Content-Type: image/jpeg');
$data = pack("H*", pg_unescape_bytea($data));
echo $data;
I'm running PHP 5.3.8; as of PHP 5.4.0, it turns out you can use hex2bin instead of pack.
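That equivalence is easy to check: both calls turn a hex string back into the same raw bytes (the hex value here is just an illustration):

```php
<?php
// pack("H*", ...) and hex2bin() (PHP >= 5.4) decode the same hex string
// to the same raw bytes; '48656c6c6f' is the hex encoding of "Hello".
$hex = '48656c6c6f';
$viaPack    = pack('H*', $hex);
$viaHex2bin = hex2bin($hex);
```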
When adding the image to the database, I used this:
$data = pg_escape_bytea($data); // Escape input for PostgreSQL
$sql = "UPDATE asset SET data = '%s' WHERE uuid = '%s'";
I'm glad it is working now. Thank you both Daniel and Johann!

Optimizing code for inserting 27000*2 keys from a plain text file to a DB

I need to insert data from a plain text file, exploding each line into 2 parts and then inserting them into the database. I'm doing it this way, but can this program be optimized for speed?
The file has around 27000 lines of entries.
DB structure [unique key (ext,info)]
ext [varchar]
info [varchar]
code:
$string = file_get_contents('list.txt');
$file_list = explode("\n", $string);
$entry = 0;
$db = new mysqli('localhost', 'root', '', 'file_type');
$sql = $db->prepare('INSERT INTO info (ext, info) VALUES (?, ?)');
$j = count($file_list);
for ($i = 0; $i < $j; $i++) {
    $data = explode(' ', $file_list[$i], 2);
    $sql->bind_param('ss', $data[0], $data[1]);
    $sql->execute();
    $entry++;
}
$sql->close();
echo $entry . ' entries inserted!<hr>';
If you are sure that the file contains unique pairs of ext/info, you can try disabling keys for the import:
ALTER TABLE `info` DISABLE KEYS;
And after import:
ALTER TABLE `info` ENABLE KEYS;
This way the unique index will be rebuilt once for all records, not every time something is inserted.
To increase speed even more, you should change the format of this file to be CSV compatible and use MySQL's LOAD DATA to avoid parsing every line in PHP.
When there are multiple items to be inserted you usually put all data in a CSV file, create a temporary table with columns matching CSV, and then do a LOAD DATA [LOCAL] INFILE, and then move that data into destination table. But as I can see you don't need much additional processing, so you can even treat your input file as a CSV without any additional trouble.
$db->query('CREATE TEMPORARY TABLE _tmp_info (ext VARCHAR(255), info VARCHAR(255))');
$db->query("LOAD DATA LOCAL INFILE '{$filename}' INTO TABLE _tmp_info
            FIELDS TERMINATED BY ' '
            LINES TERMINATED BY '\n'"); // $filename = 'list.txt' in your case
$db->query('INSERT INTO info (ext, info) SELECT t.ext, t.info FROM _tmp_info t');
You can run a COUNT(*) on temp table after that to show how many records were there.
If you have a large file that you want to read in I would not use file_get_contents. By using it you force the interpreter to store the entire contents in memory all at once, which is a bit wasteful.
The following is a snippet taken from here:
$file_handle = fopen("myfile", "r");
while (!feof($file_handle)) {
    $line = fgets($file_handle);
    echo $line;
}
fclose($file_handle);
This is different in that all you are keeping in memory from the file at a single instance in time is a single line (not the entire contents of the file), which in your case will probably lower the run-time memory footprint of your script. In your case, you can use the same loop to perform your INSERT operation.
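Applied to this question's format, that line-by-line approach looks like the sketch below: only one line is held in memory at a time, and each line is split into the two fields. An in-memory stream with sample ext/info pairs stands in for list.txt, and the actual INSERT is left as a comment:

```php
<?php
// Stand-in for fopen('list.txt', 'r'); the sample pairs are assumptions.
$stream = fopen('php://memory', 'r+');
fwrite($stream, "jpg image/jpeg\npng image/png\n");
rewind($stream);

$pairs = array();
while (($line = fgets($stream)) !== false) {
    $line = rtrim($line, "\r\n");
    if ($line === '') {
        continue; // skip blank lines (e.g. a trailing newline) instead of inserting empty rows
    }
    $pairs[] = explode(' ', $line, 2); // [ext, info], as in the original loop
    // In the real script: $sql->bind_param('ss', ...); $sql->execute();
}
fclose($stream);
```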
You could also use something like Talend. It's an ETL tool, simple and free (it has a paid version too).
Here is the magic solution [3 seconds vs 240 seconds]:
ALTER TABLE info DISABLE KEYS; -- run in MySQL before the import
$db->autocommit(FALSE);
// ... the inserts ...
$db->commit();
ALTER TABLE info ENABLE KEYS; -- run in MySQL after the import

Capture SQL output in PHP script as timestamped file

I have a MySQL query that runs in a PHP file and outputs a page of data. I need to somehow output that data to a file, to allow me to perform client-side functions on it as needed before exporting it for download.
I need the temporary file named with a timestamp. I'm wondering if I should use fopen to create the file with a name based on something like date(), then fwrite the $mysql data, then fclose?
Is this the correct way to do this?
The SELECT ... INTO OUTFILE statement is intended primarily to let you very quickly dump a table to a text file on the server machine.
Read the documentation here:
http://dev.mysql.com/doc/refman/5.0/en/select.html
You could definitely do it that way. If it works for your purposes, then I would consider it correct. Sounds like you've got a good handle on it.
$filename = time() . '.txt';
$fp = fopen($filename,'w');
fputs($fp,$mysql);
fclose($fp);
If you want the filename with actual date numbering instead of a UNIX timestamp, use this instead:
$filename = date('YmdHis').'.txt';
You'll have to get your data into an exportable format of course... the code above assumes that $mysql contains your data and not just a query resource.
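As a hedged sketch of that last step, assuming the query results have already been fetched into $rows as associative arrays (the sample data here is invented), fputcsv() gives you an exportable format together with the timestamped filename:

```php
<?php
// Assumed sample data standing in for fetched query results.
$rows = array(
    array('id' => 1, 'name' => 'alpha'),
    array('id' => 2, 'name' => 'beta'),
);

// Timestamped temp file, as in the answer above.
$filename = sys_get_temp_dir() . '/' . date('YmdHis') . '.csv';
$fp = fopen($filename, 'w');
foreach ($rows as $row) {
    fputcsv($fp, $row); // one CSV line per result row
}
fclose($fp);

$written = file_get_contents($filename);
unlink($filename); // clean up the temporary export
```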