If you’re building a module that relies on Drupal’s cron to do heavy work for you, it might be a better idea to split the task into chunks to spread the load.

Let’s say you have a single cron task with some heavy database work that could take the cron job over a minute to execute. Here is a little algorithm you can put in place to lighten the load on each run.

We need to assume a couple of things, but these can easily be adjusted for your case. First, we have to assume the table has an incremental index such as a primary key; if it doesn’t, there is another way to do this, but most tables should have one. Let’s also assume the cron runs once an hour and each row needs to be affected once per day, which means we have 24 runs in which to act on every row.
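To make the arithmetic concrete — the table size of 25,000 rows is a made-up figure — each hourly run needs to cover roughly a 24th of the table, rounded up so the last chunk doesn’t leave a few rows uncovered:

```php
<?php
// Hypothetical numbers: 24 hourly cron runs per day, 25,000 rows in the table.
$cron_runs = 24;
$rows = 25000;

// Round up so every row falls into some chunk.
$rpc = ceil($rows / $cron_runs);

echo $rpc; // 1042 rows per run
```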

function MODULE_cron() {
  // Setting how many cron runs there are in a day
  $cron_runs = 24;

  // Get the total number of rows that need to be acted upon
  $rows = db_query("SELECT COUNT(*) FROM {tbl_name}")->fetchField();

  // Rows per cron run (rounded up so every row is covered)
  $rpc = ceil($rows / $cron_runs);

  // Get the last row we acted on, else set to 0
  $last_row = variable_get('MODULE_last_row', 0);

  // Set new last row
  $new_last_row = $last_row + $rpc;

  // Get all the rows we want to act on
  $result = db_query("SELECT * FROM {tbl_name} WHERE primary_key > :last_row AND primary_key <= :new_last_row", array(':last_row' => $last_row, ':new_last_row' => $new_last_row));

  foreach ($result as $row) {
    // DO WORK
  }

  // Get the new last row, if over the limit, reset to 0
  $new_last_row = ($new_last_row > $rows ? 0 : $new_last_row);

  // Set the new last row that we acted on
  variable_set('MODULE_last_row', $new_last_row);
}

The beauty here is that even if the table grows over time, the number of rows handled per run grows with it. It scales well and helps take the load off each cron run. Very useful if you have a table with a lot of data.
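For the case mentioned earlier where the table has no incremental primary key, a variant of the same idea is to store an offset between runs and page through the table with Drupal 7’s `db_query_range()`. This is only a sketch: the table `{tbl_name}`, the column `some_column`, and the variable name `MODULE_offset` are placeholders you would swap for your own.

```php
function MODULE_cron() {
  // Setting how many cron runs there are in a day.
  $cron_runs = 24;

  // Total rows, and how many to handle this run.
  $rows = db_query("SELECT COUNT(*) FROM {tbl_name}")->fetchField();
  $rpc = ceil($rows / $cron_runs);

  // Resume from the offset stored by the previous run, else start at 0.
  $offset = variable_get('MODULE_offset', 0);

  // A stable ORDER BY keeps the paging consistent between runs.
  $result = db_query_range("SELECT * FROM {tbl_name} ORDER BY some_column", $offset, $rpc);

  foreach ($result as $row) {
    // DO WORK
  }

  // Advance the offset; wrap around once we've covered the whole table.
  $offset += $rpc;
  variable_set('MODULE_offset', $offset >= $rows ? 0 : $offset);
}
```

Note that offset-based paging can skip or repeat rows if other processes insert or delete between runs, which is why the key-range approach above is preferable when a primary key exists.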