I want to save mail attachments, along with their size, in a database.
So I open the mail in text mode with PHP; for attachments I see something like this example:
Content-Type: image/jpeg; name="donoghte D2.jpg"
Content-Disposition: attachment; filename="donoghte D2.jpg"
Content-Transfer-Encoding: base64
X-Attachment-Id: f_gvn2345e0
/9j/4AAQSkZJRgABAQEAYABgAAD/2wBDAAIBAQIBAQICAgICAgICAwUDAwMDAwYEBAMFBwYHBwcG
BwcICQsJCAgKCAcHCg0KCgsMDAwMBwkODw0MDgsMDAz/2wBDAQICAgMDAwYDAwYMCAcIDAwMDAwM ...
I can display it with this code:
<?php
header('Content-Type: image/jpeg');
echo base64_decode($text); // $text holds the base64 body shown above
?>
If I want to calculate the size of this file and store it along with its size in a database, what is the best way?
Should I save the base64 encoding of it (as sent in the mail) in the database?
If so, what should the datatype of that field be?
To calculate its size, should I decode it and then take its strlen, or is there a faster way?
With special thanks for your attention.
You're dealing with a binary object there, so it's probably best to store it as such. I'm not sure which database you're using, but MySQL has the BLOB (Binary Large OBject) type for this exact purpose.
You could also write it to the file system. There are dozens of good discussions about the merits of both techniques on Stack Overflow, so I won't go into it here (e.g. Storing images in DB? Yea or Nay?).
I believe that if you have the decoded data in a string, strlen() will give you its file size. You could also query the database or filesystem after storage to get it.
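For illustration, a minimal sketch with PDO, assuming a hypothetical table attachments (name VARCHAR(255), size INT, data MEDIUMBLOB) and an open $pdo connection. MEDIUMBLOB rather than plain BLOB, since BLOB tops out at 64 KB:
<?php
$data = base64_decode($text); // raw bytes of the attachment
$stmt = $pdo->prepare('INSERT INTO attachments (name, size, data) VALUES (?, ?, ?)');
$stmt->bindValue(1, 'donoghte D2.jpg');
$stmt->bindValue(2, strlen($data), PDO::PARAM_INT); // decoded size in bytes
$stmt->bindValue(3, $data, PDO::PARAM_LOB);         // store as binary, not text
$stmt->execute();
?>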
To get a string length in PHP you can use strlen().
Format: I would use a BLOB, which lets you store the binary data in its base64_decoded form. But if you don't care about storage capacity and want to resend the attachment, you may store the base64_encoded data (to save processing time) in any text format the DB supports. If you do care about DB storage capacity, I would save the file separately and store only the file name and path in the DB.
Get file size: to get the image length, use strlen() on the decoded data. It would be better to also use the Content-Length header if one is present.
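If decoding a large attachment just to measure it feels wasteful, the decoded size can also be derived from the base64 text itself. A sketch, with the attachment's base64 text in a hypothetical $b64 variable:
<?php
// MIME base64 wraps lines, so strip all whitespace before counting.
$b64 = preg_replace('/\s+/', '', $b64);
// Every 4 base64 characters encode 3 bytes; '=' padding reduces the total.
$padding = substr_count(substr($b64, -2), '=');
$decodedSize = (int)(strlen($b64) * 3 / 4) - $padding;
?>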
According to http://us3.php.net/manual/en/function.mb-strlen.php#47309, the following code should give you the string length in bytes, with each multibyte character counted by its byte length:
mb_strlen($utf8_string, 'latin1');
However, I would suggest saving the file on disk, as this usually performs a lot better; pros and cons are listed in nickf's post: https://stackoverflow.com/a/8339065/863577
I have multiple text files that are very large, and storing them in MySQL takes a lot of space; 100 texts come to over 1 MB (this is just an example). I was wondering if it's possible to encrypt them to make the text shorter, so they use less MySQL DB space, and then decrypt them when I get them back from MySQL so I can see the real text.
I tried base64 and gzip compression, but all of them made the size much bigger than the original.
How can I compress the text files (encrypt/decrypt)?
You can use InnoDB (engine) compression. To your question, it uses the same algorithm as ZIP compression.
The answer is no :) You can't make text files smaller using encryption, but you can compress text data in the database; see the InnoDB compression examples in the MySQL documentation.
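If you'd rather compress in the application instead, a sketch using PHP's zlib functions (assuming a BLOB or VARBINARY column, since the compressed bytes are binary):
<?php
// Compress before INSERT; plain text typically shrinks considerably.
$compressed = gzcompress($longText, 9); // zlib format, maximum compression
// ... store $compressed in a BLOB column ...
// Decompress after SELECT to get the exact original text back.
$original = gzuncompress($compressed);
?>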
PHP has the ability to manipulate .zip files.
You could save your text into a .zip file, and simply store the filename in the database. This would save a lot of MySQL database space, but you will need some way to generate unique filenames, and somewhere to store those files.
At least they would be zipped, to save as much disk space as possible...
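A rough sketch of that approach with PHP's ZipArchive; the storage directory and the uniqid()-based naming scheme are only placeholders:
<?php
$name = uniqid('text_', true) . '.zip'; // crude unique-filename scheme
$zip = new ZipArchive();
$zip->open('/var/texts/' . $name, ZipArchive::CREATE);
$zip->addFromString('content.txt', $longText);
$zip->close();
// store only $name (and any metadata) in MySQL
?>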
If you want to make the DB smaller, you can save the large texts as files (on the local server, a CDN, or remote servers) and keep only the filenames, plus any additional information about the texts, in the DB.
As a result, you will be able to use the database in your application and read the files from hard disk.
I have created a service that hides text inside photographs. For example:
$img_name = "myimage.jpeg";
$orig_contents = file_get_contents($img_name); // read the original JPEG bytes
$msg = "My secret.";
$fp = fopen($img_name, "wb");
fwrite($fp, $orig_contents . $msg); // rewrite the file with the message appended
fclose($fp);
I'm curious: How much information can I hide inside photographs using this method? For example, could I embed chapters of a novel in an image file? I have added fairly large blocks of text without corrupting the image, but I'm wondering if PHP or image viewing applications impose limits on this.
(P.S. I am aware that this type of steganography is insecure; I'm just doing this for fun.)
You should take a look at steganography. And be aware that you are not hiding your data in the image: anyone who opens the image with a text editor will see your text somewhere in the file (in this case at the end, which is much worse). If I were you, I'd do the following:
Encrypt your data with some decent algorithm and a strong key.
Create a function that distributes your data through the file in a pseudo-random way, so that no one would notice you're hiding something in it (be aware that you have to recover it afterwards). In a regular bitmap image, you can use the last bit of each pixel to store your information, since that change would not be perceived by the human eye if you compared the original image with the one carrying the hidden data (see the sketch after this list).
Pray the NSA isn't reading this, otherwise you could get into some serious trouble :)
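A minimal sketch of the last-bit idea using PHP's GD extension, assuming a true-color PNG (a lossless format is required, since JPEG recompression would destroy the bits; the filenames and the NUL-terminator convention are just illustrative):
<?php
$img = imagecreatefrompng('cover.png');
$msg = "My secret." . "\0"; // NUL marks the end of the message
// Turn the message into a string of bits, 8 per character.
$bits = '';
foreach (str_split($msg) as $ch) {
    $bits .= str_pad(decbin(ord($ch)), 8, '0', STR_PAD_LEFT);
}
$w = imagesx($img);
// Overwrite the least significant bit of each pixel's blue channel.
foreach (str_split($bits) as $i => $bit) {
    $x = $i % $w;
    $y = intdiv($i, $w);
    $rgb = imagecolorat($img, $x, $y);
    $rgb = ($rgb & ~1) | (int)$bit; // blue is the low byte of the color int
    imagesetpixel($img, $x, $y, $rgb);
}
imagepng($img, 'stego.png');
?>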
No, there's essentially no limit imposed by either PHP or the JPEG format on how much data you can add to an image using this method. This works because the JPEG format stores all of the image data at the beginning of the file, up to an end-of-image marker; any data after the marker is assumed to be something else, such as a thumbnail.
One cool trick (which also works with GIF images) is that you can append a ZIP file to the end of an image, and the file works as both a JPEG and a ZIP file. It will be readable by both image processing programs and ZIP programs just by changing the file extension.
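For instance, a toy sketch (the file names are made up):
<?php
// Concatenation works because JPEG readers stop at the end-of-image marker,
// while ZIP readers locate the archive directory from the end of the file.
file_put_contents('combo.jpg',
    file_get_contents('photo.jpg') . file_get_contents('archive.zip'));
// copy combo.jpg to combo.zip and a ZIP tool will open the archive
?>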
I think this is not the most secure way to do it. If you really want to hide a string in an image, you would probably use a specific pattern, changing a pixel every 10 pixels. The idea is simple: convert your image to an array of integers, loop through the array, and every 10 pixels change the value to the ASCII character code.
Changing 1 pixel in every 10 won't make a lot of noise.
To make it more secure, use encoding: use your own map for the ASCII values, like #fvdalcin proposed.
I am reading imap email with PHP and downloading attachments, using the core imap PHP functions. I am also connecting via the imap protocol (not POP). I would like to be able to know the filesize of attachments before I load them into memory, to reduce server resource usage.
For example I want a limit of 5mb per attachment, this way I can discard any attachments that are over the limit, before they consume any server resources. Finding the size of the email might help but the problem with that is I want the limit per attachment, not per email.
imap_fetchbody() loads the attachment, and that all works. I could probably read the size of that function's result, but it would mean loading the attachment into memory first.
Thanks
This question is just over 3 years old with no real answer, I'm surprised.
Loading the email structure with imap_fetchstructure() will reveal all the information you'll need for this task. I will assume you know how to obtain the particular email you want to check before calling imap_fetchstructure().
Anyway, the function returns a list of "parts", as defined by the IMAP4 specification. Each part is given as an object, and one of its properties, bytes, indicates the number of bytes for that part. Another property, encoding, indicates the transfer encoding of that part.
As Paul answered, Base64 encoding is generally about 33% larger than the original size which should be good enough to determine the original data length of the attachment.
I believe you can avoid the hassle of loading the data into memory by writing it directly to a file handle with the imap_savebody() function. You can overcome the encoding issue by attaching a stream filter to that file handle before calling imap_savebody(), using the stream_filter_append() function.
I may edit my answer and add some working PHP code when I have more time.
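As a rough, untested sketch of the approach in the meantime (it assumes $imap is an open imap_open() stream and $msgno a message number; the 5 MB limit and output filenames are just illustrative):
<?php
$limit = 5 * 1024 * 1024; // 5 MB per attachment
$struct = imap_fetchstructure($imap, $msgno);
$parts = $struct->parts ?? []; // single-part messages have no parts array
foreach ($parts as $i => $part) {
    // bytes is the size in the transfer encoding; base64 inflates the data
    // by roughly 33%, so estimate the decoded size before comparing.
    $size = ($part->encoding == ENCBASE64) ? (int)($part->bytes * 3 / 4) : $part->bytes;
    if ($size > $limit) {
        continue; // over the limit: skip without ever loading it
    }
    $fp = fopen("attachment_$i.bin", 'wb');
    if ($part->encoding == ENCBASE64) {
        // Decode on the fly so raw bytes land straight in the file.
        stream_filter_append($fp, 'convert.base64-decode');
    }
    imap_savebody($imap, $fp, $msgno, (string)($i + 1)); // sections are 1-based
    fclose($fp);
}
?>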
I realize this is an older post, but in case it helps, the difference in size is due to the attachments being base64-encoded, which results in a roughly 33% expansion of the data. From RFC3501: "[Body size is] A number giving the size of the body in octets. Note that this size is the size in its transfer encoding and not the resulting size after any decoding."
I have a script that gets the raw binary image data via url request. It then takes the data and puts it into mysql.
Pretty simple, right? Well, I'm inserting some 8,000 decent-sized 600x400 JPEGs, and for some odd reason some of the images are getting cut off. Maybe the part of my script that iterates through each image it needs to get is going too fast?
When I do a straight request to the URL I can see all the raw image data, but on my end, the data is cut off some way down the line.
Any ideas why?
Is something in the chain treating the binary data as a string, in particular a C-style null-terminated string? That could cause it to get cut off at the first null byte ('\0').
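If so, a sketch of a safer insert using mysqli prepared statements; the connection details and the images table here are hypothetical. It's also worth confirming the column is MEDIUMBLOB, since a plain BLOB column caps at 64 KB and, depending on SQL mode, MySQL may truncate oversized values with only a warning:
<?php
$mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');
$data = file_get_contents('http://example.com/photo.jpg'); // raw JPEG bytes
$stmt = $mysqli->prepare('INSERT INTO images (data) VALUES (?)');
$null = null;
$stmt->bind_param('b', $null);   // 'b' declares a blob parameter
$stmt->send_long_data(0, $data); // stream the bytes; no escaping, no cut-off at \0
$stmt->execute();
$stmt->close();
?>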
Have you tried simply calling your script that pulls the binary image and dumping it out? If you see the image correctly, then it's not the pulling part; it might be something to do with the inserting.
Are you setting the headers correctly?
ie:
header('Content-Length: '.strlen($imagedata));
header('Content-Type: image/png');
...
A string datatype would definitely not be the optimum for storing images in a DB.
In fact, I've seen several recommendations that the image should go in a folder somewhere in your filesystem and the DB should contain only the address/file path.
I've seen a page about inserting images which makes the same suggestion about the file path, and notes that a BLOB datatype is better if the images must go in the database.
If it's a blob, then treating it as a string won't work.
If you make repeated requests to the same url, does the image eventually load?
If so, that points to a networking issue. TCP window scaling may be enabled in your kernel (assuming Linux), which doesn't work correctly for a lot of Windows clients. I've seen a similar issue with large (1+ MB) JavaScript libraries served from a Linux machine.
http://en.wikipedia.org/wiki/TCP_window_scale_option
http://support.microsoft.com/kb/314053
I'm creating something that includes a file upload service of sorts, and I need to store data compressed with zlib's compress() function. I send it across the internet already compressed, but I need to know the uncompressed file size on the remote server. Is there any way I can figure out this information without uncompress()ing the data on the server first, just for efficiency? That's how I'm doing it now, but if there's a shortcut I'd love to take it.
By the way, why is it called uncompress? That sounds pretty terrible to me, I always thought it would be decompress...
I doubt it. From memory, I don't believe this is something the underlying zlib library provides (although it's been a good 7 or 8 years since I used it, and the up-to-date docs don't seem to indicate this feature has been added).
One possibility would be to transfer another file which contained the uncompressed size (e.g., transfer both file.zip and file.zip.size) but that seems fraught with danger, especially if you get the size wrong.
Another alternative is, if the server uncompressing is time-expensive but doesn't have to be done immediately, to do it in a lower-priority background task (like with nice under Linux). But again, there may be drawbacks if the size checker starts running behind (too many uploads coming in).
And I tend to think of decompression in terms of "explosive decompression", not a good term to use :-)
If you're uploading using the raw 'compress' format, then you won't have information on the size of the data that's being uploaded. Pax is correct in this regard.
You can store the original size as a 4-byte header at the start of the compression buffer, assuming the file size doesn't exceed 4 GB.
Some C code as an example:
// Allocate room for the size header plus the worst-case compressed output.
uLongf compressedSize = compressBound(bufsize);
uint8_t *compressBuffer = malloc(sizeof (uLongf) + compressedSize);
// Store the original (uncompressed) size in the first sizeof (uLongf) bytes.
*((uLongf *)compressBuffer) = bufsize;
compress(compressBuffer + sizeof (uLongf), &compressedSize, sourceBuffer, bufsize);
Then you send the complete compressBuffer, whose size is compressedSize + sizeof (uLongf). When you receive it on the server side, you can use the following code to get the data back:
// data is in compressBuffer, assume you already know compressed size.
uLongf originalSize = *((uLongf *)compressBuffer);
uint8_t *realCompressBuffer = compressBuffer + sizeof (uLongf);
If you don't trust the client to send the correct size, then you will need to perform some sort of check on the uncompressed data on the server side. The suggestion of uncompressing to /dev/null is a reasonable one.
If you're uploading a .zip file, it contains a directory which tells you the size of each file when uncompressed. This information is built into the file format, though again it is subject to malicious clients.
The zlib format doesn't have a field for the original input size, so I doubt you will be able to do that without simulating a decompression of the data. The gzip format has an "input size" (ISIZE) field that you could use, but then maybe you want to avoid changing the compression format or having the clients send the file size.
But even if you use a different format, if you don't trust the clients you would still need to run a more expensive check to make sure the uncompressed data is the size the client says it is. In that case, what you can do is make the uncompress-to-/dev/null process less expensive by making sure zlib doesn't keep the output data anywhere, since you only want to know the uncompressed size.
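In PHP, that could look something like the sketch below, using the incremental zlib API so each decoded chunk is counted and then discarded rather than accumulated (the chunk size is arbitrary):
<?php
function uncompressed_size(string $path): int
{
    // ZLIB_ENCODING_DEFLATE is the zlib format, as produced by gzcompress().
    $ctx = inflate_init(ZLIB_ENCODING_DEFLATE);
    $size = 0;
    $fp = fopen($path, 'rb');
    while (!feof($fp)) {
        // Count the decoded bytes, then let the chunk go out of scope.
        $size += strlen(inflate_add($ctx, fread($fp, 65536)));
    }
    fclose($fp);
    return $size;
}
?>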