I’ve managed to import content from several Omeka classic sites into an Omeka-S site that I’m building. The Omeka 2 importer works well. Now I want to move two or three sites from an Omeka-S install into this new one. Is there a way to do what the Omeka 2 importer does when you’re “harvesting” metadata and content from another Omeka-S site?
There is not an importer/aggregator for Omeka S yet. It’s on the Omeka Team’s development roadmap for the coming year. The open question is whether it will be an importer (duplicating the content in a new install) or an aggregator (pulling in pointers to content in existing installs).
OK. My particular issue is a group of people have put sites up under Omeka-S and then I find I want to pull one or two of them out of that spot and move them to another (where we can focus more particularly on the dynamics of the items in that set of collections). I’m sure that’s an edge case…
You want to move them to a new installation, or just to a new site?
A new installation. The existing (original) installation has maybe 8 sites and they have very little relationship to one another (different staff members using Omeka-S for wildly different purposes), so as one of the sites grows (15000+ items) it’s time to move it off to a new installation where all the items in the database more or less relate to the topic at hand. It’s also an installation that we want to put long-term organizational support behind. So moving to a new install and bringing over just a few of those sites makes the most sense.
My workaround has been to clone the original site, then delete everything that’s not related to what I wanted to end up with. That works – as far as it goes. The real problem is that I built 2 new sites in a fresh install of Omeka-S (importing and then tweaking a couple of Omeka classic sites). Ideally, I’d have been able to issue a command like “mysqldump” and send an entire Omeka-S site from the original installation (metadata, bitstreams, etc.) to a file that I could then import into another Omeka-S installation. Come to think of it, that would also be very useful for backups, so maybe it’s worth considering.
I appreciate the complexity that sort of capability would pose for the development team.
Ahh… Okay. Well, we’ll put a direct importer on the list. That’s actually not terribly hard with the API if the receiving site is set up with the vocabs and the templates.
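To make the API route concrete, here is a minimal sketch of a site-to-site copy using the Omeka S REST API (`GET /api/items`, `POST /api/items` with `key_identity`/`key_credential`). The endpoint URLs and credentials are placeholders, and `strip_server_fields` is a hypothetical helper that drops server-assigned fields before re-creating an item:

```python
# Sketch of a direct copy between two Omeka S installs via the REST API.
# URLs and keys below are placeholders; resource template and vocabulary
# IDs may differ between installs and would need remapping in practice.
import json
import urllib.parse
import urllib.request

SOURCE = "https://old.example.org/api"   # hypothetical source endpoint
DEST = "https://new.example.org/api"     # hypothetical destination endpoint
DEST_KEYS = {"key_identity": "IDENTITY", "key_credential": "CREDENTIAL"}

# Fields the destination server assigns itself; don't send them on create.
SERVER_FIELDS = {"@id", "o:id", "o:created", "o:modified",
                 "o:owner", "o:media", "o:primary_media", "o:site"}

def strip_server_fields(item):
    """Return a copy of an item payload that is safe to POST elsewhere."""
    return {k: v for k, v in item.items() if k not in SERVER_FIELDS}

def fetch_page(item_set_id, page):
    """GET one page of items in a given item set from the source install."""
    query = urllib.parse.urlencode(
        {"item_set_id": item_set_id, "page": page, "per_page": 50})
    with urllib.request.urlopen(f"{SOURCE}/items?{query}") as resp:
        return json.load(resp)

def create_item(item):
    """POST a cleaned item payload to the destination install."""
    query = urllib.parse.urlencode(DEST_KEYS)
    req = urllib.request.Request(
        f"{DEST}/items?{query}",
        data=json.dumps(strip_server_fields(item)).encode(),
        headers={"Content-Type": "application/json"},
        method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def copy_item_set(item_set_id):
    """Page through a source item set and recreate each item."""
    page = 1
    while True:
        items = fetch_page(item_set_id, page)
        if not items:
            break
        for item in items:
            create_item(item)
        page += 1
```

As noted above, this only works cleanly if the receiving install already has the same vocabularies and templates; property references that rely on install-specific IDs would still need to be remapped.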
The other workaround would be to query the API for the items with Python, transform the JSON into a CSV, and then use CSV Import to map it to the appropriate templates in the new install. It’s not quite a MySQL dump, but it’s close.
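A rough sketch of that JSON-to-CSV step, assuming you already have the item JSON-LD in hand; the property list and the “|” multivalue separator are illustrative choices, not fixed by Omeka:

```python
# Flatten Omeka S item JSON-LD into a CSV that CSV Import can ingest.
# Which properties to export, and the "|" separator, are assumptions here.
import csv
import io

def item_to_row(item, properties):
    """Collect the literal values of selected properties from one item.
    Multiple values are joined with "|" so CSV Import can split them."""
    row = {}
    for prop in properties:
        values = item.get(prop, [])
        row[prop] = "|".join(v.get("@value", "") for v in values)
    return row

def items_to_csv(items, properties):
    """Write a list of item dicts to a CSV string, one column per property."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=properties)
    writer.writeheader()
    for item in items:
        writer.writerow(item_to_row(item, properties))
    return buf.getvalue()
```

In the CSV Import job you would then set the multivalue separator to “|” and map each column onto the matching property of the resource template in the new install.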
You can try the Bulk Import module, which has such a tool. Upcoming development will soon allow importing from any JSON endpoint.