I am using PHP and the Intervention Image library.
What is the database format of a BLOB?
Does MySQL save it as base64?
Before I save an image file to the database, what should I do?
Image::make('......')->encode('data-url');
Is that it?
How to store a Binary Large Object (BLOB)?
A BLOB is a binary large object that can hold a variable amount of
data. The four BLOB types are TINYBLOB, BLOB, MEDIUMBLOB, and
LONGBLOB.
These differ only in the maximum length of the values they can hold.
The four TEXT types are TINYTEXT, TEXT, MEDIUMTEXT, and
LONGTEXT. These correspond to the four BLOB types and have the same
maximum lengths and storage requirements.
I hope the following code helps you:
CREATE TABLE IMAGE_TABLE (
    IMG_ID INT(6) NOT NULL AUTO_INCREMENT PRIMARY KEY,
    IMG_DETAILS CHAR(50),
    IMG_DATA LONGBLOB,
    IMG_NAME CHAR(50),
    IMG_SIZE CHAR(50),
    IMG_TYPE CHAR(50)
);
This will create a table that suits your requirement.
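To answer the base64 question directly: MySQL stores BLOB values as raw bytes, not base64, so there is no need to encode the image as a data URL before saving. Below is a minimal sketch of an insert into the table above using Intervention Image (v2) and mysqli; $db is an assumed open connection and the file path is a placeholder.

// a minimal sketch, assuming $db is an open mysqli connection
// and Intervention Image v2 is installed via Composer
require 'vendor/autoload.php';

use Intervention\Image\ImageManagerStatic as Image;

$path   = '/path/to/photo.jpg';                         // hypothetical source file
$binary = (string) Image::make($path)->encode('jpg');   // raw JPEG bytes, not base64

$stmt = $db->prepare(
    "INSERT INTO IMAGE_TABLE (IMG_DETAILS, IMG_DATA, IMG_NAME, IMG_SIZE, IMG_TYPE)
     VALUES (?, ?, ?, ?, ?)"
);
$details = 'profile photo';
$name    = basename($path);
$size    = (string) strlen($binary);
$type    = 'image/jpeg';
$stmt->bind_param('sssss', $details, $binary, $name, $size, $type);
$stmt->execute();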
You may also refer to the following SO answers:
Binary Data in MySQL
store TEXT/BLOB in same table or not?
Storing messages as BLOB (Binary Large Object) or ordinary text?
You can also refer to the official MySQL documentation on the BLOB and TEXT types; it is worth a read to deepen your understanding.
Related
I am creating a form that lets users upload their photos/videos, to be saved into my database. May I know if there is a way to get this done? Currently I am only saving a path in an item_path varchar column, e.g. img/cats.jpg. What can I do to store the file itself in the database? Thank you
If you cannot create BLOB column types, then I would use a TEXT type with base64 encoding of the file data. You should take care to record the MIME type of the data, and it would probably be a good idea to store a checksum (MD5 or similar) as well.
Your files table might look something like this:
-- files table
id         INT(11) NOT NULL AUTO_INCREMENT PRIMARY KEY
`blob`     MEDIUMTEXT NOT NULL  -- needs backticks: BLOB is a reserved word in MySQL
mime_type  VARCHAR(255) NOT NULL
checksum   VARCHAR(255) NOT NULL
created_at DATETIME NOT NULL
You might have other columns that associate the file with a user, or hold any other relevant information.
Here are the size limits for the MySQL TEXT column types:
TYPE          SIZE LIMIT
----------------------------------------------------
TINYTEXT      2^8  - 1  =            255 bytes
TEXT          2^16 - 1  =         65,535 bytes (64 KiB)
MEDIUMTEXT    2^24 - 1  =     16,777,215 bytes (16 MiB)
LONGTEXT      2^32 - 1  =  4,294,967,295 bytes (4 GiB)
Keep in mind that base64 inflates the data by roughly a third, so the effective limit on the original file size is about three quarters of these figures.
When you write to the DB using PHP, use the base64_encode function to encode the file.
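A minimal sketch of such a write, assuming $db is an open mysqli connection and the illustrative files table above:

// base64-encode a file and record its MIME type and checksum
$path = '/path/to/upload.png';            // hypothetical uploaded file
$raw  = file_get_contents($path);

$encoded  = base64_encode($raw);          // TEXT-safe representation of the bytes
$mime     = mime_content_type($path);     // e.g. "image/png"
$checksum = md5($raw);                    // checksum of the original, not the base64

$stmt = $db->prepare(
    "INSERT INTO files (`blob`, mime_type, checksum, created_at)
     VALUES (?, ?, ?, NOW())"
);
$stmt->bind_param('sss', $encoded, $mime, $checksum);
$stmt->execute();

When reading it back, base64_decode($row['blob']) restores the original bytes.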
If you cannot create TEXT column types either, then you can forget about saving the actual data in your database. You will have to save the files to the filesystem and just store a link to each file in your database.
Or lastly, as another comment points out, you could use external blob storage such as Amazon S3 or another cloud provider's service.
SQL Server Image Table
CREATE TABLE "SqlServerTable" (
"id" INT NOT NULL,
"image" IMAGE NOT NULL,
PRIMARY KEY ("id")
);
MySQL Image Table
CREATE TABLE `MySqlTable` (
`id` INT NOT NULL,
`image` LONGBLOB NOT NULL,
PRIMARY KEY (`id`)
);
What is the difference between image and LONGBLOB?
How can I convert data from the image type in SQL Server to the BLOB type in MySQL?
When I copy data from SQL Server to MySQL, no image appears when I write it out with file_put_contents('2xx.jpg', $MySqlTable['image']); — why?
Do I have to do special processing on the SQL Server image type?
BLOB in MySQL
BLOB values are treated as binary strings (byte strings). They have no character set, and sorting and comparison are based on the numeric values of the bytes in column values.
The four BLOB types are TINYBLOB, BLOB, MEDIUMBLOB, and LONGBLOB. These differ only in the maximum length of the values they can hold.
MySQL Connector/ODBC defines BLOB values as LONGVARBINARY.
image
Variable-length binary data from 0 through 2^31-1 (2,147,483,647) bytes.
image data type will be removed in a future version of Microsoft SQL Server. Use varbinary(max) instead.
Microsoft research: To BLOB or Not To BLOB
IMHO, both are just sets of binary data; the difference is in how each DBMS stores that binary data.
I think a better solution, which avoids any processing of images, streams, and so on, is to store URL/file links instead, unless keeping the file data in the table gives you a concrete advantage.
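If you do need to migrate the binary data itself, here is a minimal sketch using PDO; the DSNs and credentials are placeholders, and it assumes the pdo_sqlsrv and pdo_mysql drivers are installed (depending on driver settings you may also need to force binary encoding for the image column).

// copy image rows from SQL Server into MySQL; both image and LONGBLOB
// hold raw bytes, so no conversion of the data itself is needed
$mssql = new PDO('sqlsrv:Server=localhost;Database=src', 'user', 'pass');
$mysql = new PDO('mysql:host=localhost;dbname=dst', 'user', 'pass');

$insert = $mysql->prepare('INSERT INTO MySqlTable (id, image) VALUES (?, ?)');

foreach ($mssql->query('SELECT id, image FROM SqlServerTable') as $row) {
    $insert->bindValue(1, $row['id'], PDO::PARAM_INT);
    $insert->bindValue(2, $row['image'], PDO::PARAM_LOB); // avoid charset translation
    $insert->execute();
}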
I'm learning MySQL and am having tremendous trouble with this code to build an image database.
I know how to create a table, and I know I need LONGBLOB for images. Not a problem. Currently I'm creating it via:
CREATE TABLE pics
(
picid int unsigned not null auto_increment primary key,
filename varchar(255) not null unique,
caption varchar(255) not null,
pic longblob not null
);
the "not null" in picid is giving me problems. Because next when I attempt to populate using this code:
INSERT INTO pics values
(
NULL,
'bear.jpg',
'a picture of a bear',
LOAD_FILE('C:/Users/USERS_NAME/Pictures/bear.jpg')
);
I get hit with the error #1048 - Column 'pic' cannot be null.
Please help, I am losing my mind...
It's not the picid that's the problem: LOAD_FILE('C:/Users/USERS_NAME/Pictures/bear.jpg') most likely fails and returns NULL. LOAD_FILE returns NULL when the MySQL server process cannot read the file, when the path is not allowed by secure_file_priv, when the file is larger than max_allowed_packet, or when your user lacks the FILE privilege.
Not to mention, you shouldn't store images in a database. Images are files and should be stored as such on the file system; the database should hold the metadata plus the file's address in the filesystem.
See "Efficiently storing user uploaded images on the file system" for a good system to follow.
If you still want to continue using the BLOB method, try this tutorial:
http://forum.codecall.net/topic/40286-tutorial-storing-images-in-mysql-with-php/
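As an alternative to LOAD_FILE, you can read the file on the PHP side, which sidesteps the server-side privilege and path restrictions entirely. A minimal sketch, assuming $db is an open mysqli connection to the database holding the pics table:

// read the file in PHP and let a prepared statement carry the bytes,
// instead of relying on server-side LOAD_FILE()
$path = 'C:/Users/USERS_NAME/Pictures/bear.jpg';
$pic  = file_get_contents($path);
if ($pic === false) {
    die("Cannot read $path");   // fail here instead of inserting NULL
}

$stmt = $db->prepare("INSERT INTO pics (filename, caption, pic) VALUES (?, ?, ?)");
$name    = 'bear.jpg';
$caption = 'a picture of a bear';
$stmt->bind_param('sss', $name, $caption, $pic);
$stmt->execute();   // picid is AUTO_INCREMENT, so it can be omitted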
I need to store a very large amount of text in a MySQL database. It will be millions of records with a LONGTEXT field, and the database size will be huge.
So I want to ask: is there a safe way to compress the text before storing it in the TEXT field, to save space, with the ability to extract it back if needed?
Something like:
$archived_text = compress_text($huge_text);
// saving $archived_text to database here
// ...
// ...
// getting compressed text from database
$archived_text = get_text_from_db();
$huge_text = uncompress_text($archived_text);
Is there a way to do this with PHP or MySQL? All the texts are UTF-8 encoded.
UPDATE
My application is a large literature website where users can add their texts. Here is the table I have:
CREATE TABLE `book_parts` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`book_id` int(11) NOT NULL,
`title` varchar(200) DEFAULT NULL,
`content` longtext,
`order_num` int(11) DEFAULT NULL,
`views` int(10) unsigned DEFAULT '0',
`add_date` datetime DEFAULT NULL,
`is_public` tinyint(3) unsigned NOT NULL DEFAULT '1',
`published_as_draft` tinyint(3) unsigned NOT NULL DEFAULT '0',
PRIMARY KEY (`id`),
KEY `key_order_num` (`order_num`),
KEY `add_date` (`add_date`),
KEY `key_book_id` (`book_id`,`is_public`,`order_num`),
CONSTRAINT FOREIGN KEY (`book_id`) REFERENCES `books` (`id`) ON DELETE CASCADE
) ENGINE=InnoDB DEFAULT CHARSET=utf8
Currently it has about 800k records and weighs 4 GB; 99% of queries are SELECTs. I have every reason to think these numbers will increase dramatically. I wouldn't like to store the texts in files, because there is quite heavy logic around them and my website gets quite a few hits.
Are you going to index these texts? How big is the read load on these texts? The insert load?
You can use InnoDB data compression - a transparent and modern approach. See the docs for more info.
If you have really huge texts (say, each text is above 10 MB), then it is a good idea not to store them in MySQL. Store the gzip-compressed texts in the file system and keep only pointers and metadata in MySQL. You can easily expand your storage in the future and move it to, e.g., a DFS.
Update: another plus of storing the texts outside MySQL: the DB stays small and fast. Minus: a higher probability of data inconsistency.
Update 2: if you have significant programming resources, please take a look at projects like this one: http://code.google.com/p/mysql-filesystem-engine/.
Final update: according to your info, you can just use InnoDB compression - it uses zlib, the same algorithm as ZIP. You can start with these params:
CREATE TABLE book_parts
(...)
ENGINE=InnoDB
ROW_FORMAT=COMPRESSED
KEY_BLOCK_SIZE=8;
Later you will need to tune KEY_BLOCK_SIZE. Watch the compress_ops and compress_ops_ok counters in INFORMATION_SCHEMA.INNODB_CMP; the ratio of compress_ops_ok to compress_ops should stay close to 1.0 (see the docs).
If you're compressing (e.g. with gzip), then don't use TEXT fields of any sort: they're not binary-safe. Data going into or coming out of TEXT fields is subject to character set translation, which will probably (though not necessarily) mangle the compressed data and give you a corrupted result when you retrieve and uncompress the text.
Use BLOB fields instead, which are binary-transparent and do not do any translation of the data.
It might be better to define the text field as a BLOB and compress the data in PHP, to save communication costs.
CREATE TABLE book_parts (
......
content blob default NULL,
......
)
In PHP, use gzcompress and gzuncompress.
// a sketch using mysqli prepared statements (the legacy mysql_* functions
// are removed in PHP 7); $db is an assumed open mysqli connection
$content = '......';
$id = 111;
$compressed = gzcompress($content);
$stmt = $db->prepare("REPLACE INTO book_parts (id, content) VALUES (?, ?)");
$stmt->bind_param('is', $id, $compressed);
$stmt->execute();

$stmt = $db->prepare("SELECT content FROM book_parts WHERE id = ?");
$stmt->bind_param('i', $id);
$stmt->execute();
if ($row = $stmt->get_result()->fetch_assoc())
    $content = gzuncompress($row['content']);
You may also want to use the COMPRESS option to enable compression of the packets between client and server.
Read some information about this option:
Use Compression in MySQL Connector/Net
Compress Property in dotConnect for MySQL
For PHP, I have found MYSQLI_CLIENT_COMPRESS for the mysqli_real_connect function.
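A minimal sketch of enabling it (host, credentials, and database name are placeholders):

// compress the client/server protocol traffic with zlib
$db = mysqli_init();
mysqli_real_connect(
    $db, 'localhost', 'user', 'pass', 'mydb', 3306, null,
    MYSQLI_CLIENT_COMPRESS
);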
You could also use the PHP functions gzdeflate and gzinflate for the text.
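For example (a sketch; 9 is the maximum compression level):

// gzdeflate produces raw DEFLATE output, slightly smaller than gzcompress,
// which adds a zlib header; store the result in a BLOB column, not TEXT
$packed = gzdeflate($huge_text, 9);
$huge_text = gzinflate($packed);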
There are no benefits to compressing large texts before storing them in a database.
Here are the problems you might face in the long run:
If the server crashes, the data may be hard to recover.
Not ideal for search.
It takes additional time to transfer the data between the mysql server and the browser.
Time consuming for backup (not using replication).
I think storing these large texts as disk files will make things easier for:
Distributed backup (rsync).
PHP to handle file upload.
I am in the process of migrating a large amount of data from several databases into one. As an intermediary step I am copying the data to a file for each data type and source db and then copying it into a large table in my new database.
The structure is simple in the new table, called migrate_data. It consists of an id (primary key), a type_id (incremented within the data type set), data (a field containing a serialized PHP object holding the data I am migrating), source_db (refers to the source database, obviously), data_type (identifies what type of data we are looking at).
I have created keys and key combinations for everything but the data field. Currently I have the data field set as a longtext column. User inserts are taking about 4.8 seconds each on average. I was able to trim that down to 4.3 seconds using DELAY_KEY_WRITE=1 on the table.
What I want to know about is whether or not there is a way to improve the performance even more. Possibly by changing to a different data column type. That is why I ask about the longtext vs text vs blob. Are any of those more efficient for this sort of insert?
Before you answer, let me give you a little more information. I send all of the data to an insert function that takes the object, runs it through serialize, then runs the data insert. It is also being done using Drupal 6 (and its db_query function).
Any efficiency improvements would be awesome.
Current table structure:
CREATE TABLE IF NOT EXISTS `migrate_data` (
`id` int(10) unsigned NOT NULL AUTO_INCREMENT,
`type_id` int(10) unsigned NOT NULL DEFAULT '0',
`data` longtext NOT NULL,
`source_db` varchar(128) NOT NULL DEFAULT '',
`data_type` varchar(128) NOT NULL DEFAULT '',
PRIMARY KEY (`id`),
KEY `migrated_data_source` (`source_db`),
KEY `migrated_data_type_id` (`type_id`),
KEY `migrated_data_data_type` (`data_type`),
KEY `migrated_data_id__source` (`id`,`source_db`),
KEY `migrated_data_type_id__source` (`type_id`,`source_db`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8 DELAY_KEY_WRITE=1;
The various TEXT/BLOB types are all identical in storage requirements and perform exactly the same way, except that TEXT fields are subject to character set conversion while BLOB fields are not. In other words, BLOBs are for when you're storing binary data that MUST come out exactly the same as it went in; TEXT fields are for storing text data that may/can/will be converted from one charset to another.
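So for the serialized objects in this migration, a BLOB-typed data column is the safer choice if the payload can contain raw binary. A minimal sketch with mysqli ($db is an assumed connection; it presumes the data column has been changed to LONGBLOB, and Drupal 6's db_query layer would wrap something equivalent):

// store a serialized PHP object in a BLOB column so that no charset
// translation can corrupt it on the way in or out
$data = serialize($someObject);   // $someObject is whatever you are migrating

$stmt = $db->prepare(
    "INSERT INTO migrate_data (type_id, data, source_db, data_type)
     VALUES (?, ?, ?, ?)"
);
$typeId   = 1;          // illustrative values
$sourceDb = 'legacy1';
$dataType = 'user';
$stmt->bind_param('isss', $typeId, $data, $sourceDb, $dataType);
$stmt->execute();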