Since I started in 2012, I've learned a few things about how the systems of traditional publishing work. As a career systems technologist, I've paid particular attention to the data systems, standards, and tools I've been able to learn about.
While it's a truism that traditional authors who have gone hybrid, or converted completely to independent publishing, have shared a lot about traditional publishing with the indie community, by the nature of things we don't hear much from the techies in the book trade. That makes it hard to come up to speed on the technical systems traditional publishers use.
Why do I care?
I can't help but notice, whenever I compare my own ebook listings at a retailer with a traditional publisher's, that theirs are often cleaner and more complete. Knowing that they have large catalogues to maintain, I want to learn how that's done, so that I can achieve the same result. My own catalogue is now 24 titles, so it's not just a matter of data quality but also of data quantity.
Managing and Updating Metadata for Many Titles
Whenever I have a bright idea about a better way to manage keywords or categorization, or how to adjust pricing or format book descriptions, I find myself facing my catalogue and shaking my head at the prospect of 24 titles times all my distributors.
Today my ebooks are widely distributed. I go directly to Amazon, Barnes & Noble, Kobo, and Smashwords, and I use PublishDrive and StreetLib to reach Apple, Google Play, and dozens of international retailers. That's in effect 6 retailer/distributors, for each of whom I might have to update 24 titles: 144 listings in all.
It would be so much better to have a single database for all my titles and just use that to update all my trading partners, so that they could update their own trading partners or international sites, wouldn't it? My massive spreadsheet can marshal all the data, perhaps, but that just sits on my computer and doesn't communicate with anyone.
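To make the idea concrete, here is a minimal sketch of what such a "single source of truth" could look like: one canonical record per title, with a small export step that reshapes the same data for each channel. Everything here is hypothetical, including the field names and the channel-specific formatting rule; real distributors have their own feed formats (ONIX, spreadsheets, web forms) that this glosses over.

```python
from dataclasses import dataclass, field, asdict

@dataclass
class TitleRecord:
    """One canonical metadata record per title (fields greatly simplified)."""
    isbn: str
    title: str
    price_usd: float
    description: str
    keywords: list = field(default_factory=list)

def export_for_channel(record: TitleRecord, channel: str) -> dict:
    """Shape the canonical record to one channel's (imagined) expectations."""
    data = asdict(record)
    if channel == "kobo":
        # Hypothetical example: suppose one channel wants keywords
        # as a single semicolon-separated string rather than a list.
        data["keywords"] = "; ".join(record.keywords)
    return data

# The whole catalogue lives in one place...
catalogue = [
    TitleRecord("9780000000001", "Example Book", 4.99,
                "A sample description.", ["fantasy", "magic"]),
]

# ...and every distributor's listing is generated from it.
exports = {channel: [export_for_channel(t, channel) for t in catalogue]
           for channel in ["amazon", "kobo"]}
```

The point of the sketch is the shape of the workflow, not the code: edit a keyword or a price once in the canonical record, regenerate the exports, and every channel's data stays consistent, instead of making the same change six times by hand.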