Hello,
This question follows on from my post from a year ago, which I haven’t been able to solve yet (BatchEdit and ORMInvalidArgumentException)
I am now trying to run a CSV import to revise a single metadata field on 3000+ items in the system. The items span many different resource types and classes. However, I started getting the error below after the first default batch of 20 completes successfully (“Number of rows to process by batch” in the Advanced Settings tab):
Doctrine\ORM\ORMInvalidArgumentException: A new entity was found through the relationship 'Omeka\Entity\Resource#resourceClass' that was not configured to cascade persist operations for entity: DoctrineProxies\__CG__\Omeka\Entity\ResourceClass@00000000000005ef0000000000000000. To solve this issue: Either explicitly call EntityManager#persist() on this unknown entity or configure cascade persist this association in the mapping for example @ManyToOne(..,cascade={"persist"}). If you cannot find out which entity causes the problem implement 'Omeka\Entity\ResourceClass#__toString()' to get a clue. in /data/ibali/omeka-s/vendor/doctrine/orm/lib/Doctrine/ORM/ORMInvalidArgumentException.php:114
Stack trace:
#0 /data/ibali/omeka-s/vendor/doctrine/orm/lib/Doctrine/ORM/UnitOfWork.php(3474): Doctrine\ORM\ORMInvalidArgumentException::newEntitiesFoundThroughRelationships()
#1 /data/ibali/omeka-s/vendor/doctrine/orm/lib/Doctrine/ORM/UnitOfWork.php(385): Doctrine\ORM\UnitOfWork->assertThatThereAreNoUnintentionallyNonPersistedAssociations()
#2 /data/ibali/omeka-s/vendor/doctrine/orm/lib/Doctrine/ORM/EntityManager.php(376): Doctrine\ORM\UnitOfWork->commit()
#3 /data/ibali/omeka-s/application/src/Api/Adapter/AbstractEntityAdapter.php(442): Doctrine\ORM\EntityManager->flush()
#4 /data/ibali/omeka-s/application/src/Api/Manager.php(233): Omeka\Api\Adapter\AbstractEntityAdapter->update()
#5 /data/ibali/omeka-s/application/src/Api/Manager.php(136): Omeka\Api\Manager->execute()
#6 /data/ibali/omeka-s/modules/CSVImport/src/Job/Import.php(796): Omeka\Api\Manager->update()
#7 /data/ibali/omeka-s/modules/CSVImport/src/Job/Import.php(424): CSVImport\Job\Import->updateRevise()
#8 /data/ibali/omeka-s/modules/CSVImport/src/Job/Import.php(296): CSVImport\Job\Import->update()
#9 /data/ibali/omeka-s/modules/CSVImport/src/Job/Import.php(194): CSVImport\Job\Import->processBatchData()
#10 /data/ibali/omeka-s/application/src/Job/DispatchStrategy/Synchronous.php(34): CSVImport\Job\Import->perform()
#11 /data/ibali/omeka-s/modules/Log/src/Job/Dispatcher.php(32): Omeka\Job\DispatchStrategy\Synchronous->send()
#12 /data/ibali/omeka-s/application/data/scripts/perform-job.php(66): Log\Job\Dispatcher->send()
#13 {main}
If I increase the batch size to 200, the first 200 rows import fine before the error crops up again.
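For context, my rough understanding of the error (a sketch based on the Doctrine docs, not Omeka’s or CSVImport’s actual code — the entity names and IDs here are hypothetical) is that if the entity manager gets cleared between batches, a ResourceClass proxy loaded during an earlier batch becomes detached, and reusing it afterwards makes Doctrine treat it as a brand-new entity at flush time:

```php
<?php
// Illustrative only: how a detached proxy can trigger
// "A new entity was found through the relationship ...".

$class = $entityManager->find(ResourceClass::class, 1); // managed proxy

$entityManager->clear(); // commonly called between import batches
                         // to free memory; $class is now DETACHED

$item = $entityManager->find(Item::class, 42); // freshly managed entity
$item->setResourceClass($class); // associates a detached proxy

$entityManager->flush(); // throws ORMInvalidArgumentException, since
                         // Doctrine sees $class as an unpersisted entity
```

That would at least fit the symptom of the first batch always succeeding and the failure appearing exactly at the batch boundary, but I may be off base.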
I have tried disabling some modules as suggested in my previous post (Numeric Data Type, Bulk Edit), but that has had no effect. I can keep switching them off one by one, but I am wondering if there are any tips on what kinds of modules might cause this conflict, if a module is indeed the cause (there are a lot of modules…).
I also followed this link (Batch updating resources with a resource template may throw a UniqueConstraintViolationException · Issue #1690 · omeka/omeka-s · GitHub), but I believe the versions we run should already include that fix. I have reproduced the problem on both our prod and dev installations:
Omeka S
Version 3.2.1
PHP
Version 8.1.18
SAPI apache2handler
Memory Limit 1G
POST Size Limit 2G
File Upload Limit 1G
Garbage Collection Yes
Extensions apache2handler, bcmath, bz2, calendar, Core, ctype, curl, date, dom, exif, FFI, fileinfo, filter, ftp, gd, gettext, hash, iconv, intl, json, ldap, libxml, mbstring, mysqli, mysqlnd, openssl, pcre, PDO, pdo_mysql, pdo_pgsql, pgsql, Phar, posix, readline, Reflection, session, shmop, SimpleXML, soap, sockets, sodium, SPL, standard, sysvmsg, sysvsem, sysvshm, tokenizer, xml, xmlreader, xmlwriter, xsl, Zend OPcache, zip, zlib
MySQL
Server Version 5.7.42-0ubuntu0.18.04.1
Client Version mysqlnd 8.1.18
Mode ONLY_FULL_GROUP_BY, STRICT_TRANS_TABLES, NO_ZERO_IN_DATE, NO_ZERO_DATE, ERROR_FOR_DIVISION_BY_ZERO, NO_AUTO_CREATE_USER, NO_ENGINE_SUBSTITUTION
OS
Version Linux 4.15.0-213-generic x86_64
and
Omeka S
Version 3.2.3
PHP
Version 8.1.2-1ubuntu2.15
SAPI apache2handler
Memory Limit 1G
POST Size Limit 2G
File Upload Limit 1G
Garbage Collection Yes
Extensions apache2handler, calendar, Core, ctype, date, dom, exif, FFI, fileinfo, filter, ftp, gd, gettext, hash, iconv, imagick, json, libxml, mbstring, mysqli, mysqlnd, openssl, pcre, PDO, pdo_mysql, Phar, posix, readline, Reflection, session, shmop, SimpleXML, sockets, sodium, SPL, standard, sysvmsg, sysvsem, sysvshm, tokenizer, xml, xmlreader, xmlwriter, xsl, Zend OPcache, zlib
MySQL
Server Version 8.0.36-0ubuntu0.22.04.1
Client Version mysqlnd 8.1.2-1ubuntu2.17
Mode ONLY_FULL_GROUP_BY, STRICT_TRANS_TABLES, NO_ZERO_IN_DATE, NO_ZERO_DATE, ERROR_FOR_DIVISION_BY_ZERO, NO_ENGINE_SUBSTITUTION
OS
Version Linux 5.15.0-105-generic x86_64
Otherwise, what would happen if I set “Number of rows to process by batch” to something ambitious like 3000? Though it would be great to solve the problem more permanently.
Many thanks, as always,
Sanjin