Batch update memory error

Hi,

I’ve run into a problem with batch updates. I was trying to batch update a specific collection of items that are digitized books, which is relevant because every item has 100+ media attached to it. When I run a batch update on these items, it throws an out-of-memory error after completing only the first 100 items. Separately, I’ve had no problem running batch updates on 10,000+ items that more conventionally have 1 or 2 media attached to them.

Doing some testing, though, I found that if I changed the array_chunk size in the job from 100 to 25, it runs without issue. So is it possible that, because there are so many media attached to these items, 100 is too many to update at once?
Is there a downside to setting the array_chunk to 25?
Or is there some other memory issue at play here?
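For context, the change I tested is just the chunk size in the usual array_chunk loop. Here’s a rough sketch of the pattern, not the actual job source; `$api`, `$itemIds`, and `$dataToUpdate` are stand-ins for whatever the job actually passes around:

```php
<?php
// Simplified illustration of the chunked batch update pattern -- not the
// real job code. $api, $itemIds, and $dataToUpdate are placeholders.

$chunkSize = 25; // was 100; smaller chunks keep fewer items in memory at once

foreach (array_chunk($itemIds, $chunkSize) as $idsChunk) {
    // Each pass loads every item in the chunk -- along with all of its
    // attached media -- before writing the update, so items with 100+ media
    // make a chunk of 100 far heavier than a chunk of 100 ordinary items.
    $api->batchUpdate('items', $idsChunk, $dataToUpdate);
}
```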

Thanks,

Joe

From what you describe, I would bet that yes, these items you’re dealing with are just big enough that they’re breaking our assumption that 100 at a time will be fine.

Reducing the chunk size shouldn’t have any negative effect other than possibly on the performance/speed of the job, and even there it’s not a given that the effect would be negative.

Yes, thanks. I’ve actually noticed it runs a little faster at the smaller chunk size. Though that might be in my head because I was getting errors all day.