I use flat files on my website to store account information for my game. But now, gem collecting doesn't work anymore. The filesize of the "accounts.ini" file is 218,750 bytes and the bottom of it looks like this:
[117157336030342728342]
GEMS = 7
[112725622837339591876]
GEMS = "1 4"4"
As you can see, the last line is wrong for some reason. What is causing this? Did I hit the filesize limit?
EDIT: Well, I should have tried it before, but I edited the last line back to normal and it started working again. But you're right, I should use a database for such things. I've tried using a database before and didn't like it back then, but I guess that's not a good reason to avoid it. I was just told that I won't be able to reach the "php.ini" file, and I don't think I can find it either. Anyways, I'll switch to a database now, thanks for the advice!
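For anyone else reading this later, a minimal sketch of the database route for a per-account gem counter might look like the following. The file, table, and column names here are made up for illustration, not the site's real schema.
// Sketch only: one row per account, one gem counter.
$db = new PDO('sqlite:accounts.db');
$db->exec('CREATE TABLE IF NOT EXISTS accounts (id TEXT PRIMARY KEY, gems INTEGER NOT NULL DEFAULT 0)');
// Make sure the row exists, then bump the counter for one account id.
$db->prepare('INSERT OR IGNORE INTO accounts (id) VALUES (?)')->execute(['117157336030342728342']);
$db->prepare('UPDATE accounts SET gems = gems + 1 WHERE id = ?')->execute(['117157336030342728342']);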
Related
I'm having this issue with the JSONAPI for minecraft. http://mcjsonapi.com/
I am trying to use the method "files.write" or "setFileContents" to replace the contents of a file. The website states this about the method.
Pretty simple. Just rewrites the file, right? Yeah, but this is proving to be more difficult than I thought. On my first attempt, I was trying to write 3450 characters to the file "groups.yml" on the Minecraft server. Here's the code I ran in PHP:
var_dump(
    $api->call("files.write", array("plugins/GroupManager/worlds/world/groups.yml", (string)$yaml))
);
The var_dump is supposed to either return a success statement or an error describing what went wrong, but instead all I get is "null". This isn't right, and I know $yaml is being cast to a string, so that isn't the issue. So I decide to start testing around. After much testing, I find that the character length of what I can set is exactly 1622. Adding another space or anything else causes null; otherwise, it works. This is the modified file, below 1622 characters, that I tested with.
So great, you found the issue, right? No, I didn't. I thought 1622 was an odd number to stop working at, so I did some further testing. I tried to set 3000 characters I generated by just smashing my keyboard, and it worked! So what's going on here?
This and this work, but this doesn't. Why is that? This app called Adminium runs this exact API and includes a file management system inside the app, which I am assuming uses the same methods I am using, but it doesn't have a problem.
I have a forum post here that I also asked on, and I still haven't gotten an answer yet.
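For what it's worth, the kind of length testing described above can be scripted. This is only a rough probe (assuming the same $api client and $yaml string as in the snippet above), not anything official from the API:
// Hypothetical probe: write progressively longer prefixes of $yaml until the
// call starts returning null, to narrow down where it breaks.
for ($len = 1600; $len <= strlen((string)$yaml); $len += 10) {
    $result = $api->call("files.write",
        array("plugins/GroupManager/worlds/world/groups.yml", substr((string)$yaml, 0, $len)));
    if ($result === null) {
        echo "Breaks at length $len, near: " . substr((string)$yaml, max(0, $len - 10), 10) . "\n";
        break;
    }
}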
I tried using PHP to run 7zip to recursively extract all the zip files that a user had put inside other zip files, and then delete the original zips.
The code I used worked, except for a larger file (about 7 GB) with some unusual file types (hdr and cab files, for example): it did not fully extract the files, made duplicates of some of the ones it did extract, and then did not delete the original zip. The only thing out of the ordinary I saw was that the command prompt I ran the PHP file from said "Incomplete Extraction". I'm not sure why the extraction and deletion worked for every file but this one.
Any help in understanding this would be greatly appreciated!
Thank you for your time
Here is the code snippet:
$cmd_2 = "FOR /R \"$zip_file_directory\" %I IN (*.zip) DO (7z x \"%I\" -aou -o\"%~dpI\" && del \"%~fI\")";
exec($cmd_2, $out_2, $ret_2);
EDIT
Also, it returned a 0 exit code, so again I have no idea what went wrong.
However, looking at the $out_2 array I can see about 2700 key/value pairs (example: [2685] => Extracting Client Video\Reviewer\setup.lid).
And at the very end it says "Sub Item Errors: 5", but I can't seem to find a way to work out exactly what that means.
EDIT II
I was going through the 2700 lines of output and found a few like this: "[1325] => can not open output file ...." followed by a filename.
Any idea why this is happening so rarely (it looks like these are the 5 errors) out of thousands of lines of extraction?
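If it helps, one way to pull those few failures out of the thousands of captured lines is to filter the $out_2 array. This is just a sketch; the match strings are based only on the messages quoted above:
// Keep only the 7zip output lines that look like errors.
$errors = array_filter($out_2, function ($line) {
    return stripos($line, 'can not open') !== false
        || stripos($line, 'error') !== false;
});
print_r($errors); // should surface just the handful of problem files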
EDIT III
There is an article here that states this might be a 7zip issue related to the firewall; can anyone confirm or deny this?
I have a file (usernames.txt) that holds the username of every one of my website's members. When they submit their usernames, they get saved to "usernames.txt". The problem is that a lot of users submit their usernames every day, so I want a PHP script or something that will automatically delete the first (topmost) username every 30 seconds. Even though more than 20 usernames are submitted every minute, the script I need will keep "usernames.txt" smaller, and that will make my server a bit faster. :)
It would be really great if someone has or knows the script I am talking about. :)
Thanks
I would also really, really encourage you to look at http://php.net/manual/en/refs.database.php as #elclanrs suggested. Really encourage you.
Failing that, and as a distant second suggestion, I'd recommend trying serialize and unserialize, so you can read the data in and out of the file quicker. You read the data in as an array/object using unserialize, manipulate it, and put it back in the file using serialize.
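A minimal sketch of that serialize/unserialize approach, assuming the whole list lives in usernames.txt as one serialized array:
// Sketch: the file holds one serialized array of usernames.
$file  = 'usernames.txt';
$users = unserialize(file_get_contents($file)); // read the list back into an array
array_shift($users);                            // drop the oldest (first) username
file_put_contents($file, serialize($users));    // write the updated list back out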
Further failing that, you would have to read the file into memory (file_get_contents), loop through the lines, remove the first line, and then write the entire file back out. Alternatively, read it in one line at a time (fopen), skip the first line, write the rest to a temporary file (fputs), and then swap the files around (rename).
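A rough sketch of that second, line-by-line variant (the file names are assumptions):
$src = fopen('usernames.txt', 'r');
$tmp = fopen('usernames.tmp', 'w');
fgets($src);                              // read and throw away the first line
while (($line = fgets($src)) !== false) {
    fputs($tmp, $line);                   // copy the remaining lines as-is
}
fclose($src);
fclose($tmp);
rename('usernames.tmp', 'usernames.txt'); // swap the new file into place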
Lastly, assuming you're on a *nix system, you could use something like:
exec("sed '1d' {$my_file_name}", $result);
file_put_contents($my_file_name, $result);
That all being said, you should really look at using a DB. If you don't want a standalone database, you can use sqlite, which will write the database to a local file:
$dbhandle = sqlite_open('db/test.db', 0666, $error);
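If your PHP build no longer ships that legacy sqlite extension, the same idea works through PDO's SQLite driver. This is only a sketch; the table and columns are made-up examples:
$db = new PDO('sqlite:db/test.db');   // the database is still just a local file
$db->exec('CREATE TABLE IF NOT EXISTS usernames (id INTEGER PRIMARY KEY, name TEXT)');
$db->prepare('INSERT INTO usernames (name) VALUES (?)')->execute(['someuser']);
$db->exec('DELETE FROM usernames WHERE id = (SELECT MIN(id) FROM usernames)'); // drop the oldest entry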
There are quite a few different threads about this similar topic, yet I have not been able to fully comprehend a solution to my problem.
What I'd like to do is quite simple, I have a flat-file db, with data stored like this -
$username:$worldLocation:$resources
The issue is that I would like to have a submit-data HTML page that would update this line based on a search for the term, using PHP:
search db for - $worldLocation
if $worldLocation found
replace entire line with $username:$worldLocation:$updatedResources
I know there should be a fairly easy way to get this done, but I am unable to figure it out at the moment. I will keep trying while this post is up, but if you know a way I could use, I would greatly appreciate the help.
Thank you
I always loved C, and the functions that came into PHP from C.
Check out fscanf and fprintf.
These will make your life easier when reading and writing in a fixed format. For example:
$filehandle = fopen("file.txt", "r"); // open for reading ("c" is write-only, so fscanf would fail)
while ($values = fscanf($filehandle, "%s\t%s\t%s\n")) {
    list($a, $b, $c) = $values;
    // do something with $a, $b, $c
}
fclose($filehandle);
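The matching write side with fprintf might look like this (a sketch, assuming the same tab-separated format and sample values):
list($a, $b, $c) = array("name", "location", "resources"); // sample values
$out = fopen("file.txt", "w");                             // or "a" to append
fprintf($out, "%s\t%s\t%s\n", $a, $b, $c);                 // write one formatted record
fclose($out);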
Also, there is no performance workaround that avoids reading the entire file into memory, changing one line, and writing the entire file back out. You have to do it.
This is about as efficient as you can get, because you are most probably running native C code; I've read somewhere that PHP just wraps C's functions in these cases.
You like the hard way, so be it...
Make each line the same length. Add spaces, tabs, a capital X, etc. to fill in the blanks.
When you want to replace a line, find it; since every line has the same fixed length, you can overwrite it in place.
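A sketch of how that in-place overwrite works once every record has the same length (the record width, file name, and sample values here are assumptions):
$recordLength  = 64;                                  // fixed width, including the trailing "\n"
$lineToReplace = 3;                                   // zero-based line number to overwrite
$newLine = str_pad("bob:castle_3:wood=12,stone=4", $recordLength - 1) . "\n";

$fp = fopen('db.txt', 'r+');
fseek($fp, $lineToReplace * $recordLength);           // jump straight to that record's offset
fwrite($fp, $newLine);                                // overwrite only that record
fclose($fp);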
For speed and less hassle, use a database (even SQLite).
If you're committed to the flat file, the simplest thing is iterating through each line, writing a new file & changing the one that matches.
Yeah, it sucks.
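A rough sketch of that approach; the file name and form fields are assumptions for illustration:
// Stream the old file, replace the line whose second field matches the
// submitted location, write everything to a temp file, then swap.
$username         = $_POST['username'];
$worldLocation    = $_POST['worldLocation'];
$updatedResources = $_POST['resources'];

$in  = fopen('db.txt', 'r');
$out = fopen('db.txt.tmp', 'w');
while (($line = fgets($in)) !== false) {
    $fields = explode(':', rtrim($line, "\n"));
    if (isset($fields[1]) && $fields[1] === $worldLocation) {
        fputs($out, "$username:$worldLocation:$updatedResources\n"); // replace the whole matching line
    } else {
        fputs($out, $line);                                          // keep everything else as-is
    }
}
fclose($in);
fclose($out);
rename('db.txt.tmp', 'db.txt'); // swap the rewritten file into place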
I'd strongly recommend switching over to a 'proper' database. If you're concerned about resources or the complexity of running a server, you can look into SQLite or Berkeley DB. Both of these use a database that is 'just a file', removing the issue of installing and maintaining a DB server, but still giving you the ability to quickly and easily search, replace, and delete individual records. If you still need the flat file for some other reason, you can easily write some import/export routines.
Another interesting possibility, if you want to be creative, would be to look at your filesystem as a database. Give each user a directory. In each directory, have a file for locations. In each file, update the resources. This means that, to insert a row, you just write to a new file. To update a file, you just rewrite a single file. Deleting a user is just nuking a directory. Sure, there's a bit more overhead in slurping the whole thing into memory.
Other ways of solving the problem might be to make your flat-file write-only, since appending to the end of a file is a trivial operation. You then create a second file that lists "dead" line numbers that should be ignored when reading the flat file. Similarly, you could easily "X" out the existing lines (which, again, is far easier than trying to update lines in a file that might not be the same length) and append your new data to the end.
Those second two ideas aren't really meant to be practical solutions as much as they are to show you that there's always more than one way to solve a problem.
OK... after a few hours' work, this example worked fine for me.
I intended to code an editing tool and use it for password updates, and it did the trick!
Not only does this page send an email to the user (sorry, the address is hardcoded to avoid posting additional code) with the new password, but it also edits the entry for that user and re-writes all the file info to a new file.
When done, it obviously swaps the filenames, storing the old file as usuarios_old.txt.
Grab the code here (sorry, Stack Overflow got VERY picky about code posting):
https://www.iot-argentina.xyz/edit_flat_databse.txt
Is this what you are looking for:
UPDATE `table` SET `field_to_replace` = '$username:$worldLocation:$updatedResources' WHERE `field` = '$worldLocation';
I'm writing a php script where I call
$lines = file('base_list.txt');
to break a file up into an array. The file has over 100,000 lines in it, which should be 100,000 elements in the array, but when I run
print_r($lines);
exit;
the array only contains 7280 elements.
So I'm curious, WTF? Is there a limit on the number of keys an array can have? I'm running this locally on a dual-core 2.0GHz machine with 2GB of RAM (Vista & IIS, though), so I'm a little confused how a 4MB file could produce results like this.
Edit:
I probably should have mentioned that I had previously set memory_limit to 512MB in php.ini as well.
Darryl Hein,
Yeah, there isn't anything in the error logs. I even increased error reporting and still nothing relevant to print_r().
In response to Jay:
I ran
echo count($lines);
and I get a result of 105,546, but print_r() still only displays 7280.
Taking Rob Walker's advice I looped over all the elements in the array and it actually contained all the results. This leads me to believe the issue is with print_r() itself instead of a limit to array size.
Another weird thing is that I tried it on one of my RHEL servers and the result was as it should be. Now, I don't want to blame this on Windows/IIS, but there you go.
With the above in mind, I think this question should be re-titled, as it's no longer about arrays but about print_r.
Edit: As Rob said, if your application is running out of memory, chances are it won't even get to the print_r line. That's kinda my hunch as well, but if it is a memory issue, the following might help.
Take a look in the php.ini file for this line, and perhaps increase it to 16 or more.
memory_limit = 8M
If you don't have access to php.ini (for example, if this is happening on a shared server), you can fix it with a .htaccess file like this
php_value memory_limit 16M
Apparently some hosts don't allow you to do this though.
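As a last resort you can also try raising the limit at runtime, though hosts that block .htaccess overrides often block this too (just a sketch):
ini_set('memory_limit', '64M'); // may be silently ignored if the host locks the setting down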
Is it possible there is an inherent limit on the output from print_r? I'd suggest looking for the first and last lines of the file to see if they are in the array. If you were hitting a memory limit while inserting into the array, you would never have gotten to the print_r line.
Two suggestions:
Count the actual number of items in the array and see whether or not it has the correct number of entries (thereby eliminating or identifying print_r() as the culprit)
Verify the input: is there any chance the line endings in the file are causing a problem? For example, is there a mix of different types of line endings? See the manual page for file() and the note about the auto_detect_line_endings setting as well, though it's unlikely this is related to Mac line endings.
I believe it is based on the amount of available memory as set in the php.ini file.
Every time I've run out of memory in PHP, I've received an error message stating that fact. So I'd also say that if you were running out of memory, the script wouldn't get to the print_r().
Try enabling auto_detect_line_endings in php.ini or by using ini_set('auto_detect_line_endings', 1). There may be some line endings that Windows doesn't understand, and this ini option could help. More info about this ini option can be found here.
You should use count to count the number of items in an array, not print_r. What if the output of this large array was aborted because of timeouts or something else? Or because of some bug/feature in print_r?
PHP's print_r function does have limitations. However, even though you don't "see" the entire array printed, it is all there. I've struggled with this same issue when printing large data objects.
It makes debugging difficult; if you must see the entire array, you can create a loop to print every line.
foreach ($FileLines as $Line) echo $Line;
That should let you see all the lines without limitation.
I'm gonna agree with Cory. I'm thinking your PHP is probably configured with the default memory limit of 8MB, and 4MB x 2 is already more than that. The reason for the x2 is that you have to load the file, and then to create the array you need the file's contents in memory a second time. I'm just guessing, but that would make sense.
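One quick way to check that guess is to log the peak memory right after building the array (a sketch, using the same file name as above):
$lines = file('base_list.txt');
// Peak usage in bytes; compare it against the memory_limit setting.
echo round(memory_get_peak_usage(true) / 1048576, 1) . " MB peak\n";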
Are you sure PHP isn't logging an error?
If you're outputting to something like Internet Explorer, you might want to make sure it can display all the information you're trying to put there. I know there's a limit to an html page, but I'm not sure what it is.