LinuxQuestions.org


Basher52 02-29-2012 03:31 PM

php memory limit
 
I'm an admin at a forum that uses http://www.one.com/en/ as its web host.
I want to make a "remote" backup of the MySQL database, because that is much faster than using
phpMyAdmin, and likewise with all the web files (downloading those takes over an hour).

one.com's PHP memory_limit is 65M, so when I try to run the PHP script I get this error:
"PHP Fatal error: Out of memory (allocated 16777216) (tried to allocate 21979956 bytes) in /customers/xxxx/httpd.www/forum/admin/backup/backup.mysql.php on line 34"

I tried to use
Code:

ini_set('memory_limit', '256M');
but this didn't work.

First, is there a way to list the memory_limit from a PHP script?
That way I can list it, change it, and list it again to see whether my value was actually set.

Second, if the above won't work, is there a way to run the script so it stays within the existing limit?

These are the scripts:

The script I run from the web browser:
Code:

<?php
ini_set('track_errors', true);
ini_set('log_errors', true);
ini_set('error_log', dirname(__FILE__) . '/error_log.txt');
ini_set('memory_limit', '256M');  // shared hosts may silently refuse this

include "backup.mysql.php";
$z = new backupmysql();
// arguments are host, user, password, database name (placeholder values)
$z->backup_tables('localhost','database','password','database_name');
?>


The script that makes the backup:
Code:

<?php
ini_set('track_errors', true);
ini_set('log_errors', true);
ini_set('error_log', dirname(__FILE__) . '/error_log.txt');
ini_set('memory_limit', '256M');

class backupmysql
{
/* backup the whole db OR just the given tables */
function backup_tables($host,$user,$pass,$name,$tables = '*')
{
  $link = mysql_connect($host,$user,$pass);
  mysql_select_db($name,$link);

  //get all of the tables
  if($tables == '*')
  {
    $tables = array();
    $result = mysql_query('SHOW TABLES');
    while($row = mysql_fetch_row($result))
    {
      $tables[] = $row[0];
    }
  }
  else
  {
    $tables = is_array($tables) ? $tables : explode(',',$tables);
  }

  //the whole dump is accumulated here, so start empty ONCE,
  //not once per table as before (that kept only the last table)
  $return = "";

  //cycle through
  foreach($tables as $table)
  {
    $result = mysql_query('SELECT * FROM '.$table);
    $num_fields = mysql_num_fields($result);

    $return .= 'DROP TABLE '.$table.';';
    $row2 = mysql_fetch_row(mysql_query('SHOW CREATE TABLE '.$table));
    $return .= "\n\n".$row2[1].";\n\n";

    //a single pass over the result set is enough; the extra for-loop
    //that wrapped this while did nothing after its first iteration
    while($row = mysql_fetch_row($result))
    {
      $return .= 'INSERT INTO '.$table.' VALUES(';
      for($j = 0; $j < $num_fields; $j++)
      {
        if (isset($row[$j]))
        {
          //escape quotes, then literal newlines (str_replace instead
          //of the deprecated ereg_replace)
          $return .= '"'.str_replace("\n","\\n",addslashes($row[$j])).'"';
        }
        else
        {
          $return .= '""';
        }
        if ($j < ($num_fields-1)) { $return .= ','; }
      }
      $return .= ");\n";
    }
    $return .= "\n\n\n";
  }

  //save file
  $handle = fopen('./db-backup-'.time().'-'.(md5(implode(',',$tables))).'.sql','w+');
  fwrite($handle,$return);
  fclose($handle);
}
}
?>
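
As an aside on the second question (running within the limit): the memory use comes from two places, the whole dump being accumulated in $return and mysql_query() buffering the entire result set inside PHP. A streaming variant that writes each row straight to the file and fetches rows with mysql_unbuffered_query() keeps memory use small and roughly constant. This is only a minimal, untested sketch; the credentials and file name are placeholders:
Code:

<?php
// Minimal streaming sketch: nothing is accumulated in a string; every row
// is written to the dump file as soon as it is fetched.
function backup_tables_streaming($host, $user, $pass, $name)
{
  $link = mysql_connect($host, $user, $pass);
  mysql_select_db($name, $link);

  $handle = fopen('./db-backup-' . time() . '.sql', 'w');

  $tables = array();
  $result = mysql_query('SHOW TABLES');
  while ($row = mysql_fetch_row($result)) {
    $tables[] = $row[0];
  }

  foreach ($tables as $table) {
    $row2 = mysql_fetch_row(mysql_query('SHOW CREATE TABLE ' . $table));
    fwrite($handle, 'DROP TABLE ' . $table . ";\n\n" . $row2[1] . ";\n\n");

    // unbuffered: rows stay on the MySQL server until fetched one by one
    $result = mysql_unbuffered_query('SELECT * FROM ' . $table);
    while ($row = mysql_fetch_row($result)) {
      $values = array();
      foreach ($row as $value) {
        $values[] = isset($value)
          ? '"' . str_replace("\n", "\\n", addslashes($value)) . '"'
          : '""';
      }
      fwrite($handle, 'INSERT INTO ' . $table . ' VALUES(' . implode(',', $values) . ");\n");
    }
    fwrite($handle, "\n\n\n");
  }

  fclose($handle);
}
?>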


anomie 02-29-2012 04:34 PM

Quote:

Originally Posted by Basher52
First, is there a way to list the memory_limit with a php script?

I believe a simple phpinfo() will do that for you.
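
For a single value, ini_get() also works, which covers the "list it, change it, list it again" part; a minimal sketch:
Code:

<?php
// print the current limit, try to raise it, then print it again
echo 'memory_limit is ' . ini_get('memory_limit') . "\n";
ini_set('memory_limit', '256M');
echo 'memory_limit is now ' . ini_get('memory_limit') . "\n";
?>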

resolv_25 03-01-2012 07:56 AM

Yes, phpinfo() is the way to see the allowed memory.
If you are on a shared server you may not get 256 MB of memory; it depends on the hosting.
However, I prefer to put a single command in a bash script and run that from a cron job. It works faster.
The command may be like this:
Code:

mysqldump --user=dbUser --password=dbPassword databaseName > databaseNameBackup.sql
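
For example, as a daily cron entry (the schedule and backup path here are just placeholders):
Code:

# run the dump every night at 03:00 (added with: crontab -e)
0 3 * * * mysqldump --user=dbUser --password=dbPassword databaseName > /home/user/backups/databaseNameBackup.sql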

sundialsvcs 03-01-2012 08:40 AM

Don't ask PHP to do such a thing! :tisk: Don't ask an httpd server to do such a thing! :tisk:

You need a background script, unrelated to the web server, to perform these operations.

Yes, you can of course write that script in PHP if you want to ... or in any other programming language that you prefer. That's your choice. It's not the language's fault, nor its limitation. (PHP is every bit as capable as any other full-featured programming language, and, like any of them, it can be used outside of a web-page context.) Rather, it is a highly inappropriate task for a web server to be doing.

Good production systems have some kind of "batch job" processing system that can farm tasks out to other computers, or at least to other processes. You could, if you wish, build a web-page administrative interface to such an external system so that authorized users could start these jobs and monitor their progress.

Even though Unix/Linux does not always ship with a batch-job monitor (unlike the venerable and batch-centric IBM MVS system of yore, which still exists, by the way), plenty of good ones are available. You don't even need to "roll your own."

Basher52 03-01-2012 11:18 AM

@anomie: OK, thanks for the phpinfo() tip, I'll keep that in mind.

@resolv_25: Can I really use mysqldump from PHP on a server hosted by a company? Don't you think that is locked down? And if I can, it would be run as a bash shell script from within PHP, right? Testing here at home is very easy since I can do whatever I like and use SSH for it.

@sundialsvcs: This I know, and while testing on my own test server I saw PHP using almost 100% of the CPU, with just a few percent left for MySQL :p But I didn't even think 'resolv's' version would work, and as you can see from my question to him/her, I still don't :p

Basher52 03-01-2012 01:44 PM

Just tested a shell script, but as I figured, I got this: PHP Warning: shell_exec() has been disabled for security reasons in......
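
For what it's worth, the host's disable_functions setting can be read in advance, so a script can check before trying; a small sketch:
Code:

<?php
// disable_functions is a comma-separated list the host sets in php.ini
$disabled = array_map('trim', explode(',', ini_get('disable_functions')));
if (in_array('shell_exec', $disabled)) {
    echo "shell_exec() is disabled on this host\n";
} else {
    echo shell_exec('mysqldump --version');
}
?>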

Basher52 03-01-2012 03:08 PM

The other thing I mentioned: copying all the web files. Downloading them "home" to my place takes over an hour, so I tried to use ZipArchive in PHP for that.
The archive (created under a temporary file name) reaches a maximum size of 128 MB, and then PHP stops with an "Internal Server Error".

What I wonder is: is there a way to create multiple smaller files instead, with parameters or such?
Or do I have to do that myself, checking the file size after every file added and keeping on with ::addFile until the current archive is "big" enough, then starting a new one?
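
As far as I can tell, ZipArchive has no built-in size-splitting option, so the manual route is the usual one: close the current archive once it passes a threshold and start a new part. A minimal, untested sketch (the directory and the 64 MB limit are placeholders):
Code:

<?php
// Split the backup into several zip parts of roughly 64 MB each by closing
// the current archive and starting a new one once it grows past the limit.
$limit = 64 * 1024 * 1024;   // placeholder: max part size in bytes
$part  = 1;
$zipName = 'backup-part' . $part . '.zip';

$zip = new ZipArchive();
$zip->open($zipName, ZipArchive::CREATE);

// walk the web root recursively (placeholder directory)
$files = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator('./forum')
);

foreach ($files as $file) {
    if (!$file->isFile()) { continue; }
    $zip->addFile($file->getPathname());

    // addFile() only queues the file; the data is written on close(),
    // so close after each file and check the size on disk
    $zip->close();
    clearstatcache();
    if (filesize($zipName) > $limit) {
        $part++;
        $zipName = 'backup-part' . $part . '.zip';
    }
    $zip->open($zipName, ZipArchive::CREATE);
}

$zip->close();
?>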

resolv_25 03-02-2012 06:05 AM

Quote:

Originally Posted by Basher52 (Post 4616132)
@anomie: OK, thanks for the phpinfo() tip, I'll keep that in mind.

@resolv_25: Can I really use mysqldump from PHP on a server hosted by a company? Don't you think that is locked down? And if I can, it would be run as a bash shell script from within PHP, right? Testing here at home is very easy since I can do whatever I like and use SSH for it.

There are two options; as sundialsvcs proposed, it's better to take the script out of PHP.
Hosting typically offers a cron job option. You can paste the previous command line straight into a cron job entry and pick the execution interval.
Alternatively, put the line in a bash script and put that script in a cron job; in the script you can back up several databases separately, as in the sketch below.
This is simpler to handle, and the process is automatic.
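
For example, a small bash script that dumps each database to its own dated file (the names and paths are placeholders), with the cron entry then pointing at the script:
Code:

#!/bin/bash
# backup.sh - dump each database to a separate timestamped file
STAMP=$(date +%Y%m%d)
mysqldump --user=dbUser --password=dbPassword forumDb > /home/user/backups/forumDb-$STAMP.sql
mysqldump --user=dbUser --password=dbPassword blogDb  > /home/user/backups/blogDb-$STAMP.sql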

Basher52 03-02-2012 12:39 PM

one.com doesn't offer anything other than FTP and phpMyAdmin, so that sucks.
When I read this just recently I got furious, because I had not a CLUE about this.
So now not one bit of change is gonna happen until we move this place.

one.com has no 'cpanel' and no SSH or the like for administering the MySQL DB or the web files.
For the web files there is only FTP to download them "home", and that sure sucks.

I'm already out looking for other hosts :(

.....after the move I think it'll all be smooth as glass :P

