vb.net alternatives to php file handling functions? - php

I need VB.NET equivalents of PHP's fopen, fseek, and fwrite. Also, I need to read and write data, not just text: I need to write data at a specific byte position for x bytes, as well as read x bytes from a specific byte position.

Yes, look at the System.IO.File class.
Use the 'Open' method to get a FileStream. This has the functions you're looking for, i.e. Seek, Read, and Write.
http://msdn.microsoft.com/en-US/library/system.io.file_members(v=vs.80).aspx
http://msdn.microsoft.com/en-US/library/system.io.filestream.write(v=vs.80).aspx
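A minimal VB.NET sketch of that round trip; the file name data.bin and the offsets are placeholder assumptions:

Imports System.IO

Module Example
    Sub Main()
        ' Open for reading and writing without truncating the file.
        Using fs As FileStream = File.Open("data.bin", FileMode.Open, FileAccess.ReadWrite)
            fs.Seek(&H345, SeekOrigin.Begin)  ' like fseek()
            fs.WriteByte(&H63)                ' like fwrite() for a single byte

            fs.Seek(&H345, SeekOrigin.Begin)  ' back to the same offset
            Dim buffer(3) As Byte             ' a 4-byte buffer
            fs.Read(buffer, 0, buffer.Length) ' like fread()
        End Using
    End Sub
End Module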

Related

PHP line length limits and arrays

Scenario:
I have a PHP file that is used by a zip code lookup form. It holds numeric arrays of five-digit zip codes, anywhere from 500 to 1400 zips per array. So far it works, but I get PHP sniffer warnings in my code editor (Brackets) that I'm exceeding the 120-character line limit.
Question:
Will this stop my PHP from running in certain browsers?
Do I have to insert a line break every 120 characters just to keep the line length in compliance?
It appears I need to place these long lists in a database and read them into the array rather than hard-coding them all inside the PHP.
I am a front-end designer, so there is a lot to learn.
<?php
$zip = $_GET['zip']; // the form submits via GET

// Region 01 - PersonOne zips
$loc01 = array("59001", "59002", "59003", "59004", "59006");

// Region 02 - PersonTwo zips (quoted, since leading zeros would be
// dropped from plain numbers; could include 2000 zips)
$loc02 = array("00001", "00002", "00003", "00004", "00006");

// Region 01 - PersonOne redirect
if (in_array($zip, $loc01)) {
    header("Location: https://company.com/personone");
    exit;
}

// Region 02 - PersonTwo redirect
if (in_array($zip, $loc02)) {
    header("Location: https://company.com/persontwo");
    exit;
}
Question: Will this stop my PHP from running in certain browsers?
No, PHP runs entirely on the server. Browsers have nothing to do with PHP -- browsers are clients. Languages like HTML, CSS and (most) JavaScript are browser languages, but PHP is only server-side.
Do I have to go to every 120 characters and do a return just to keep the line length in compliance?
No, but I would highly suggest using a database to store tons of records like this. It's exactly what databases are for. Alternatively you could put them in a file and simply read the file in with PHP's file_get_contents function.
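If you go the flat-file route, a minimal sketch, assuming one zip code per line in a hypothetical zips_region01.txt:

// file() reads the whole file into an array of lines, leading zeros intact.
$loc01 = file('zips_region01.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);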
I will try to:
Add each array to a MySQL database record.
Create a PHP script that fetches each array and applies it to the respective location.
This will eliminate the bloated lines of array numbers in the PHP.
BTW, I also need to define these as 5-digit numeric strings, as many of the zips start with one or two zeros that get stripped from plain numbers and so never match the submitted value.
Thanks everyone for the input.

Is there a limit on the length of command passed to exec in PHP?

Currently I need to merge 50+ PDF files into 1 PDF. I am using PDFTK, following the guide from: http://www.johnboy.com/blog/merge-multiple-pdf-files-with-php
But it is not working. I have verified the following:
I have tried the command to merge 2 PDFs from my PHP and it works.
I have echoed the final command, copied it, pasted it into the command prompt, and run it manually; all 50 PDFs are successfully merged.
Thus the exec call in my PHP and the command to merge 50 PDFs are both correct, but they do not work when put together in PHP. I have also called set_time_limit(0) to prevent any timeout, but it is still not working.
Any idea what's wrong?
You can try to find out yourself:
print exec(str_repeat(' ', 5000) . 'whoami');
I think it's 8192, at least on my system: it fails with strings longer than 10K, but it still works with strings shorter than 7K.
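To narrow it down yourself, a rough probe along the same lines (a sketch; it just doubles the padding until exec() stops returning output):

// Pad the command with more and more spaces before 'whoami'.
for ($len = 1024; $len <= 262144; $len *= 2) {
    $out = exec(str_repeat(' ', $len) . 'whoami');
    echo $len, ' => ', ($out ? $out : 'FAILED'), "\n";
}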
I am not sure if there is a length restriction on how long a single command can be, but I am pretty sure you can split it across multiple lines with "\", just to check if that's the problem. Again, I don't think it is. Is there any error output when you try to run the full command with PHP and exec? Also try system() instead of exec().
PDFTK versions prior to 1.45 are limited to merging 26 files because they use "handles":
/* Collate scanned pages sample */
pdftk A=even.pdf B=odd.pdf shuffle A B output collated.pdf
As you can see, "A" and "B" are "handles", and a handle must be a single upper-case letter, so only A-Z can be used. If you reach that limit, your script probably outputs an error like
Error: Handle can only be a single, upper-case letter
In 1.45 this limitation was removed. Changelog extract:
You can now use multi-character input handles. Prior versions were
limited to a single character, imposing an arbitrary limitation on
the number of input PDFs when using handles. Handles still must be all
upper-case ASCII.
Maybe you only need to update your lib ;)
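On older versions you can also sidestep handles entirely, since they are optional for a plain merge. A hedged sketch (the paths are placeholder assumptions):

// Merge every PDF in a directory without handles, avoiding the
// 26-handle limit of pre-1.45 pdftk.
$files = glob('/path/to/pdfs/*.pdf');
$cmd = 'pdftk ' . implode(' ', array_map('escapeshellarg', $files))
     . ' cat output ' . escapeshellarg('/path/to/merged.pdf');
exec($cmd . ' 2>&1', $output, $status); // capture stderr and the exit code
if ($status !== 0) {
    echo implode("\n", $output);        // surface pdftk's error message
}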

insert HEX in to file with PHP

I have a tool in PHP that I made to automate the process of generating some hex, which I then manually place in a file using a hex editor.
I have 1 byte which goes to offset 0x345, and a much larger section of varying length which goes to 0x560. I use Paste > Write so that the hex I generated replaces what is in its way rather than increasing the size of the file.
Is there a way I can automate this with fopen() so that I can skip the manual pasting?
You can use these functions: fopen to open the file, fseek to seek to the desired positions, and fwrite to write your data.
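A minimal sketch of that sequence; $byte (the single byte for 0x345) and $blob (the variable-length section for 0x560) are placeholder names:

$fp = fopen('target.bin', 'r+b'); // 'r+' overwrites in place, 'b' is binary-safe
fseek($fp, 0x345);
fwrite($fp, chr($byte));          // exactly one byte, like Paste > Write
fseek($fp, 0x560);
fwrite($fp, $blob);               // replaces strlen($blob) bytes at 0x560
fclose($fp);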

PHP - *fast* serialize/unserialize?

I have a PHP script that builds a binary search tree over a rather large CSV file (5MB+). This is nice and all, but it takes about 3 seconds to read/parse/index the file.
Now I thought I could use serialize() and unserialize() to quicken the process. When the CSV file has not changed in the meantime, there is no point in parsing it again.
To my horror I find that calling serialize() on my index object takes 5 seconds and produces a huge (19MB) text file, whereas unserialize() takes an unbearable 27 seconds to read it back. Improvements look a bit different. ;-)
So - is there a faster mechanism to store/restore large object graphs to/from disk in PHP?
(To clarify: I'm looking for something that takes significantly less than the aforementioned 3 seconds to do the de-serialization job.)
var_export should be lots faster as PHP won't have to process the string at all:
// export the processed CSV to export.php
$php_array = read_parse_and_index_csv($csv); // takes 3 seconds
$export = var_export($php_array, true);
file_put_contents('export.php', '<?php $php_array = ' . $export . '; ?>');
Then include export.php when you need it:
include 'export.php';
Depending on your web server setup, you may have to adjust permissions (chmod) so the web server can read export.php first.
Try igbinary...did wonders for me:
http://pecl.php.net/package/igbinary
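If the extension is installed, the drop-in usage is simple (a sketch; index.bin is a placeholder cache path):

// igbinary_serialize()/igbinary_unserialize() mirror serialize()/unserialize()
// but use a compact binary format.
file_put_contents('index.bin', igbinary_serialize($index));
$index = igbinary_unserialize(file_get_contents('index.bin'));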
First you have to change the way your program works: divide the CSV file into smaller chunks. This is an IP datastore, I assume.
Convert all IP addresses to integer or long.
So when a query comes in, you know which part to look in.
PHP's ip2long() and long2ip() functions do this.
So over the 0 to 2^32 range, split all the IP addresses across, say, 100 smaller files (e.g. 5000K addresses at 50K per file); a sketch follows.
This approach brings you quicker serialization.
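A hedged sketch of that bucketing, assuming 64-bit PHP (so ip2long() returns a non-negative integer) and a hypothetical per-chunk naming scheme:

$buckets = 100;
$ip = ip2long('10.0.0.1');                        // the address being looked up
$bucket = intdiv($ip, intdiv(2 ** 32, $buckets)); // maps the address to 0..99
$file = sprintf('ips_%02d.ser', $bucket);         // only this chunk gets loaded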
Think smart, code tidy ;)
It seems that the answer to your question is no.
Even if you discover a "binary serialization format" option, most likely even that would be too slow for what you envisage.
So, what you may have to look into using (as others have mentioned) is a database, memcached, or an online web service.
I'd like to add the following ideas as well:
caching of requests/responses
your PHP script does not shut down but becomes a network server to answer queries
or, dare I say it, change the data structure and method of query you are currently using
I see two options here:
string serialization, in the simplest form something like
write => $line = implode("\x01", (array) $node);
read => $a = explode("\x01", $line); $node->payload = $a[0]; $node->value = $a[1]; etc.
binary serialization with pack()
write => $data = pack("fnna*", $node->value, $node->le, $node->ri, $node->payload);
read => $node = (object) unpack("fvalue/nle/nri/a*payload", $data);
It would be interesting to benchmark both options and compare the results.
If you want speed, writing to or reading from the file system is less than optimal.
In most cases, a database server will be able to store and retrieve data much more efficiently than a PHP script that is reading/writing files.
Another possibility would be something like Memcached.
Object serialization is not known for its performance but for its ease of use, and it's definitely not suited to handle large amounts of data.
SQLite comes with PHP; you could use that as your database. Otherwise you could try using sessions; then you don't have to serialize anything yourself, you just save the raw PHP object.
What about using something like JSON as the format for storing/loading the data? I have no idea how fast the JSON parser is in PHP, but it's usually a fast operation in most languages, and it's a lightweight format.
http://php.net/manual/en/book.json.php
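A sketch of the JSON round trip (index.json is a placeholder path; note that json_decode() gives you arrays or stdClass objects back, not instances of your original classes):

file_put_contents('index.json', json_encode($index));
$index = json_decode(file_get_contents('index.json'), true); // true => assoc arrays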

How can I use the PHP file API to write raw bytes?

I want to write a raw byte/byte stream to a position in a file.
This is what I have currently:
$fpr = fopen($out, 'r+');
fseek($fpr, 1); //seek to second byte
fwrite($fpr, 0x63);
fclose($fpr);
This currently writes the actual string value "99" starting at byte offset 1, i.e., it writes the bytes "9" and "9". I just want to write the actual one-byte value 0x63, which happens to represent the number 99.
Thanks for your time.
fwrite() takes strings. Try chr(0x63) if you want to write a 0x63 byte to the file.
That's because fwrite() expects a string as its second argument. Try doing this instead:
fwrite($fpr, chr(0x63));
chr(0x63) returns a string with one character with ASCII value 0x63. (So it'll write the number 0x63 to the file.)
You are trying to pass an int to a function that accepts a string, so it's being converted to a string for you.
This will write what you want:
fwrite($fpr, "\x63");
If you really want to write binary to files, I would advise using the pack() approach together with the file API.
See this question for an example.
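Applied to the example above, the pack()-based version would look like this (a sketch; 'C' packs one unsigned byte):

$fpr = fopen($out, 'r+b');     // binary-safe read/write without truncating
fseek($fpr, 1);                // seek to the second byte
fwrite($fpr, pack('C', 0x63)); // writes the single raw byte 0x63
fclose($fpr);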
