Background:
We have a large items table (15+ million rows) that is updated frequently throughout the day (roughly 300K changes per day on average). Pending changes are stored in a staging table, and a job runs throughout the day to read changes from this table, apply the updates, and mark the changes as processed.
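For context, the job today works roughly like this (table and column names below are simplified placeholders, not our actual schema):

```sql
-- Rough sketch of the current polling job (placeholder names): grab a batch
-- of unprocessed changes, apply them to the items table, mark them processed.
DECLARE @Batch TABLE (ItemId INT PRIMARY KEY, NewQuantity INT);

UPDATE TOP (1000) s
SET    s.Processed = 1
OUTPUT inserted.ItemId, inserted.NewQuantity
INTO   @Batch (ItemId, NewQuantity)
FROM   dbo.ItemChangeStaging AS s
WHERE  s.Processed = 0;

UPDATE i
SET    i.Quantity = b.NewQuantity
FROM   dbo.Items AS i
JOIN   @Batch    AS b ON b.ItemId = i.ItemId;
```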
Many other applications use the data in the items table to perform various tasks. These tasks are often scheduled and intensive: they compare live data with old snapshots and update other systems accordingly. For example, we list items on eBay, and we compare live item data with our existing listings to determine whether we need to insert new eBay listings, remove items we've sold, update quantities, and so on. Because the data set is so large, most of these applications run infrequently, so downstream systems are out of date much of the time. A sketch of one such comparison follows below.
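The eBay comparison, for example, boils down to queries along these lines (again, names are illustrative), run against the full table on every scheduled pass:

```sql
-- Illustrative diff between live items and what we last pushed to eBay
-- (names made up). Each run scans everything, even though only a small
-- fraction of items actually changed since the last run.
SELECT  i.ItemId,
        i.Quantity AS LiveQuantity,
        l.Quantity AS ListedQuantity
FROM    dbo.Items        AS i
JOIN    dbo.EbayListings AS l ON l.ItemId = i.ItemId
WHERE   i.Quantity <> l.Quantity;
```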
My Question:
We are considering implementing a publisher/subscriber pattern using Service Broker. The goal would be to publish a message whenever an item changes, which other systems (such as our eBay application) could subscribe to. This would let us make smaller, more granular updates in near real-time, rather than large, infrequent updates that query all of the data instead of just what has changed. However, searching online suggests this isn't a common database pattern, and that raises red flags. Is this not a valid use of Service Broker (I did find a short section on pub/sub in Pro SQL Server 2008 Service Broker)? How is this problem normally solved? It seems like a common enough one.
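To make this concrete, here is roughly the kind of setup I have in mind (all message type, contract, queue, and service names below are invented for illustration; only one subscriber is shown):

```sql
-- Hypothetical Service Broker setup (names invented for illustration).
CREATE MESSAGE TYPE [//Items/ItemChanged] VALIDATION = WELL_FORMED_XML;
CREATE CONTRACT [//Items/ItemChangeContract]
    ([//Items/ItemChanged] SENT BY INITIATOR);

CREATE QUEUE dbo.ItemPublisherQueue;
CREATE QUEUE dbo.EbayChangeQueue;

CREATE SERVICE [//Items/Publisher]      ON QUEUE dbo.ItemPublisherQueue;
CREATE SERVICE [//Items/EbaySubscriber] ON QUEUE dbo.EbayChangeQueue
    ([//Items/ItemChangeContract]);
GO

-- Publishing side: when the change-processing job applies an item change,
-- it would also send one message to each subscribing service.
DECLARE @handle UNIQUEIDENTIFIER;
BEGIN DIALOG CONVERSATION @handle
    FROM SERVICE [//Items/Publisher]
    TO SERVICE   '//Items/EbaySubscriber'
    ON CONTRACT  [//Items/ItemChangeContract]
    WITH ENCRYPTION = OFF;

SEND ON CONVERSATION @handle
    MESSAGE TYPE [//Items/ItemChanged]
    (N'<ItemChanged ItemId="12345" />');
GO

-- Subscribing side: the eBay application drains its queue on its own schedule.
DECLARE @dialog UNIQUEIDENTIFIER, @body XML;
WAITFOR (
    RECEIVE TOP (1)
        @dialog = conversation_handle,
        @body   = CAST(message_body AS XML)
    FROM dbo.EbayChangeQueue
), TIMEOUT 5000;
```

Since Service Broker dialogs are point-to-point, the "publish" step here is really one send per subscriber, and whether that holds up at 300K changes per day is part of what I'm unsure about.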
TL;DR:
Goal: Update various systems in a dynamic, loosely coupled way when individual items change.
Question: Is a pub/sub approach implemented with Service Broker viable in a high-volume setting?