I know this is a broad question; I just need some hints on how to approach it.
I wrote a small web app using HTML, PHP and SQLSRV (MS SQL Server Express) to collect data from our daily visits at a hospital.
Everything works fine on my laptop at home: you can add patients, view details, move them to an archive and so on. The home environment is PHP 5.6.3 with MS SQL Server 2012 Express. As soon as I copy my *.php files onto the network at my company, everything screws up: dates aren't sorted the right way, some queries work and some don't, buttons disappear, and I have no clue why this happens. As far as I know it is the same environment (apart from the "full" SQL Server 2012 instead of Express), and the PHP version is the same.
I know you can't hand me a solution, but it would be great if anyone could give me a hint on where to look for a mistake or where to start troubleshooting, or if anyone has experienced the same thing and found a solution.
If any code is needed, feel free to ask!
I recently started web development. The course I took had me install WAMP and start developing right away. I used the Atom text editor, and this, combined with WAMP, proved to be a very fast way to write client-side code (HTML, CSS, JavaScript).
But when I started to write server-side PHP, things got a little messy. I should probably explain my site's structure here.
I keep separate PHP, CSS and JavaScript files for every page on the client side; on the server side I have two different types of PHP files:
Files that only perform a specific operation on the database (for example, returning "5 more answers"). These are always called by AJAX requests.
Files that load the page for the first time. These are only used when the user opens the page for the first time; they run the necessary database queries and return the page. Later requests always go to the first type of PHP file.
Now to my problem. Until now I have debugged by printing variables to the screen with var_dump() or by echoing, but this has become too slow as the data I work with has grown. I wonder if there is a way of debugging that will let me put a breakpoint in one of my PHP files and then, when I open it in the browser on the localhost I created using WAMP, step through the PHP file line by line.
I have been dealing with this issue for three days. I tried to make it work with the Eclipse IDE but couldn't find a way. Also, there seem to be no tutorials or Q&As on the internet about this.
Breakpoint debugging opens a whole new world, and it is the natural step after var_dump() debugging. Not only does it speed up development, it provides much more information about your code, as you can step through each line and see what values have been set at each step and how they evolve as your program runs. This means you can track all of your values at different stages in a single run; imagine trying to track every variable at every point using var_dump()!
Although choosing an IDE is a personal decision based on taste, I strongly recommend you try out PhpStorm. If you can get a student licence, go for it.
PhpStorm has extensive documentation and tutorials on all features of the IDE, and debugging is no exception:
https://www.jetbrains.com/help/phpstorm/configuring-xdebug.html
https://www.youtube.com/watch?v=GokeXqI93x8
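For reference, the php.ini side of a WAMP + Xdebug 2.x setup usually looks something like the snippet below; the extension path is only an assumption for your particular install, and the setting names are the Xdebug 2.x ones covered in the link above:
zend_extension = "c:/wamp/bin/php/php5.6.x/zend_ext/php_xdebug.dll"  ; adjust to your actual xdebug .dll path
xdebug.remote_enable = 1        ; let the IDE attach to the running script
xdebug.remote_host = 127.0.0.1  ; PhpStorm runs on the same machine as WAMP
xdebug.remote_port = 9000       ; default port PhpStorm listens on
xdebug.idekey = PHPSTORM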
I don't know of a specific solution to your issue, and I'm not exactly sure what you're doing, but as a quick tip I find adding the following snippet to the top of the file useful, as it surfaces errors directly rather than the browser just saying nope.
error_reporting(E_ALL);          // report every error, warning and notice
ini_set('display_errors', 'On'); // print them in the response instead of hiding them
Hope this helps you a bit.
I tried out what was recommended in the comments and answers. I first tried NetBeans, and to be fair it disappointed me. The download kept getting stuck at 100%, even for different versions. When I stopped the download and went ahead to create a PHP project, there were parts missing, I guess; I couldn't even manage to create a PHP project. But that might just be me not being able to do it.
Then I followed #leuquim's answer and #Alex Howansky's comment and downloaded PhpStorm, and I got it to work in no more than 20 minutes. I downloaded it with a student licence. For people who want to use PhpStorm with WAMP, here's a YouTube tutorial:
https://www.youtube.com/watch?v=CxX4vnZFbZU
One thing to note: the maker of the video chooses "PHP Web Application" in the run configurations; that option has since been renamed "PHP Web Page".
What I'm looking for is an easy way to get either per-core usage or total CPU usage for the system the PHP script is running on.
However, I'm unable to do so. I've looked all over at all manner of solutions, from using perf (with and without passthru()) to using winmgmts through COM.
The issue is that some of these will work on Windows if you use Apache, but with IIS the security restrictions stop PHP from using, for example, winmgmts through COM, so I just get back a null object.
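For reference, the COM/WMI attempt looks roughly like this (a minimal sketch; the WMI query and property names are standard, but it assumes the com_dotnet extension is enabled, and under IIS it fails as described):
// Connect to the local WMI service via COM
$wmi  = new COM('winmgmts:{impersonationLevel=impersonate}//./root/cimv2');
// Ask WMI for the current load of each processor/core
$cpus = $wmi->ExecQuery('SELECT LoadPercentage FROM Win32_Processor');
foreach ($cpus as $cpu) {
    echo 'Core load: ', $cpu->LoadPercentage, '%', PHP_EOL;
}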
How can I solve this? I've honestly tried every solution I can find on the internet, and while there is lots of information about how to raise the permissions, all the guides target IIS 7 or earlier and no longer apply to IIS 8.5; the suggested options literally don't exist anymore.
If anyone could help me with this I'd really appreciate it. A workaround, such as a third-party application that provides this data, would also be acceptable if I can query the data through PHP, whether from a file, over the network, etc. Even an ASP.NET script that I could query would do (I don't know anything about ASP.NET, but I could use it for this single thing if it would work).
Thank you.
I managed to solve this and I hope it helps someone else.
What you must do is convert the folder where your PHP (or ASP) will execute into an Application, so the structure looks like this:
Website Name
-> Application Name
Then select the parent folder (the Website Name folder), go to "Basic Settings" in the far-right Actions pane, choose "Connect As..." and connect with an Administrator account.
Once you've done this, the application will inherit the credentials you specified on the parent website folder, and you'll have full access to perf, WMI and so on.
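A quick way to confirm the change took effect (a rough sketch; it assumes shell_exec() isn't disabled in your PHP configuration) is to check which Windows account the script now runs as:
// Should print the "Connect As..." account, not the default application pool identity
echo 'Running as: ', trim(shell_exec('whoami')), PHP_EOL;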
If you only give the credentials directly to the application it doesn't work, and it also doesn't work if you don't convert the folder where your scripts execute into an application. This is where I was getting tripped up, and the documentation online is very sparse.
I'd like to thank the good people at the phpsysinfo GitHub project for their IIS documentation, which put me on the right track about needing to convert a site to an application; that was the part of the puzzle I was missing.
Okay, I've been busting my head on this one.
I'm going to try to keep things short; if you need more info, don't hesitate to ask.
I've written an import repository for an external firm, so we can import their data into our service.
Quick overview of the implemented logic
FTP in, grab the XML file, parse it with SimpleXML and do the database work using Laravel's Eloquent component.
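As a rough illustration of that pipeline (a minimal sketch only; the model name, XML path and field names below are placeholders, not my actual code):
$xml = simplexml_load_file($xmlPath);   // parse the XML file grabbed via FTP
foreach ($xml->record as $node) {
    // hand each record to Eloquent for insertion
    Record::create(array(
        'external_id' => (string) $node->id,
        'name'        => (string) $node->name,
    ));
}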
On my dev machine, every run gets parsed fully and all data is inserted correctly into the database.
Problem
When I try the same thing on my production server, I get a duplicate entry error, always on the exact same record (unless I'm using another file).
Pre-script setup to help detect the error
On each run I do the following:
make sure I'm using the exact same files on both the dev and prod environments (I've disabled the FTP grab and uploaded the file manually to the correct location)
truncate all the related tables, so I'm always starting with empty tables
I've also checked for duplicates in the XML more times than I can count, but they're not in there, and the fact that my dev machine parses the file correctly confirms this.
What I tried
At this point, I've got no more clues as to how I'm supposed to debug this properly.
By now, I've checked so many things (most of them I can't even remember), all of which seemed pretty unrelated to me, but I had to try them.
Those things include:
automatic disconnects caused by the browser
MySQL wait timeouts
PHP script timeouts
memory settings
None of them seemed to help (which was exactly what I was expecting).
Another fact
The PHP version on my dev machine is 5.4.4 and the version on the production server is 5.3.2 (I know this is bad practice, but I'm not using any new features; it's really dead-easy code, though it has quite a few lines :-) ).
I've suspected this to be the cause, but I've now switched to 5.3.14 on my dev machine and the import still runs without an issue.
The changes from 5.3.2 to 5.3.14 are probably pretty minor.
I've tried to manually compile the exact same PHP version, but I'm too inexperienced to do this properly. Moreover, it probably wouldn't have the exact same build options anyway (I think it's pretty much impossible to ./configure them identically, considering one machine runs Mac OS and the other Ubuntu, especially for a noob like me), so I've abandoned this path.
I've tried to find the differences between the PHP versions, but I can't seem to stumble upon anything that might be the cause of all this.
There was a change related to non-numeric keys in arrays (or strings, for that matter) in version 5.4.4 (I think), but since I've now concluded that 5.3.14 also works, this is definitely not the issue. (Looking around insecurely, hoping I haven't said anything downright stupid.)
A quick thought while writing this:
The thing is, even though I'm getting the duplicate entry error, the record did get inserted into the database. Moreover, the error gets triggered after processing about 2,700 of the roughly 6,000 records, yet the data bound to the failing query is actually the data of the second record in the XML file.
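For what it's worth, here is roughly how the failing record could be captured for comparison between dev and prod (the model and attribute names are placeholders, and the catch is kept generic because the exact exception class depends on the Laravel version):
try {
    Record::create($attributes);   // Record / $attributes are placeholders
} catch (Exception $e) {
    // log the exact data that was bound when the duplicate entry error fired
    error_log('Failing record: ' . json_encode($attributes));
    error_log('Driver message: ' . $e->getMessage());
    throw $e;
}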
I'm sincerely hoping someone can put me on the right track with this issue :(
If you made it this far but don't have a clue what's going on, thanks for reading and sticking with it.
If you do have a clue, please enlighten me!
I have 5 cluster servers on Unix (10 Unix boxes in total) and 5 Windows servers, which I need to manage. I am looking at making a portal of sorts that would give me a snapshot of all the critical details through the browser (mount point usage, cluster statuses, Oracle database health, tablespace info, etc.).
I have Apache installed and running on one Unix box. I started out using PHP's exec() command and planned to use remsh commands to fetch data from all the servers, which I later realised wasn't working out.
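For clarity, the approach I had in mind looked roughly like this (a sketch only; the host names and the remote command are placeholders, and as noted below remsh isn't actually allowed in my environment):
$hosts = array('unixbox01', 'unixbox02');   // placeholder host names
foreach ($hosts as $host) {
    $output = array();
    // run a command on the remote box and collect its output lines
    exec('remsh ' . escapeshellarg($host) . ' df -k', $output);
    echo '<h2>' . htmlspecialchars($host) . '</h2>';
    echo '<pre>' . htmlspecialchars(implode("\n", $output)) . '</pre>';
}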
How do I go about this?
Any pointers please.
I strongly believe I can get this done with PHP; I'm not that familiar with PHP, but I would love to get working on it.
Help please!
Typically, remsh is not allowed in my environment!
I would advise that you go with an off-the-shelf solution and not try to write your own. This could be open source or proprietary. I have seen the free version of this tool and it can show much of what you described: Zenoss
Take a look at Nagios:
http://www.nagios.org/
You can also Google for "open source monitoring software"
I'm building a survey web application, using Flex for the front end (nice forms) and a MySQL database for storage, linked by PHP with the help of ZendAMF.
I largely borrowed from this nice tutorial by Alan Gruskoff:
http://digitalshowcase.biz/wordpress/?page_id=26
(The only tutorial I've found that works with the latest version of Flex.)
The app seems to work nicely in my tests, except on certain Linux boxes, where the data is somehow corrupted. There is no error message, no glitch, but the responses recorded from the forms are not what the user selected.
I tried to reproduce the error on a freshly installed Ubuntu VM, but it works fine there. I've asked friends to run some tests, and several Linux users showed the same problem, on Ubuntu and SUSE machines, all freshly updated and functional.
The application was meant to be the survey tool for my doctoral thesis, so I'm quite desperate here, and before I dump it and start anew with PHP only, I'm asking here in case someone can help. Thanks :-)
Please excuse my English, by the way.
LJ.
The problem is solved; I'm sorry, it was a very dumb error. I was fooled by the client-side complexity, and the testing seemed to point to a Linux client problem, but it was a type mismatch in the database.
I needed more testing, with a more focused procedure :)