Hi,
I’ve run into a problem with batch updates. I was trying to batch update a specific collection of items that are digitized books, which is relevant because every item has about 100+ media attached to it. When I run a batch update on these items, it throws an out-of-memory error after completing only the first 100 items. I’ve separately had no problem running batch updates on 10,000+ items that more conventionally have only 1 or 2 media attached to them.
Doing some testing, though, I found that if I changed the array_chunk size in the job from 100 to 25, it runs without issue. So is it possible that, because there are so many media attached to these items, 100 is too many to update at once?
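For reference, here is a rough sketch of the kind of change I made. This is a paraphrase, not the actual job source; `$itemIds` and `updateItem()` are just placeholders standing in for the IDs the job receives and whatever it does per item.

```php
<?php
// Sketch only, not the real job code: shrinking the chunk size passed to
// array_chunk so each pass handles fewer items (and their attached media).
$itemIds = range(1, 1000); // placeholder for the item IDs the job processes

$chunkSize = 25; // was 100 in the original job
foreach (array_chunk($itemIds, $chunkSize) as $chunk) {
    foreach ($chunk as $itemId) {
        updateItem($itemId); // placeholder for the per-item update
    }
    // My guess is that memory usage scales with how many items (and their
    // media) are in flight per chunk, so a smaller chunk keeps it lower.
}

function updateItem(int $itemId): void
{
    // Placeholder body; the real job applies the batch edits here.
}
```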
Is there a downside to setting the array_chunk size to 25?
Or is there some other memory issue at play here?
Thanks,
Joe