Having said all that, and given that there are currently no implementations of mod_changedpage to receive pages from, we will concentrate on working with the cloud element. What's more, every cloud implementation in the wild uses XML-RPC, so, for simplicity, that is all we will support here. Extending the script to support SOAP and HTTP-POST would not be hard, but it probably won't ever be necessary.
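For reference, this is the shape of a cloud element as it appears in an RSS 2.0 feed's channel; the attribute values here are illustrative. It tells subscribers where the cloud's server lives, which procedure to call to register, and which protocol to use:

<cloud domain="rpc.example.com" port="80" path="/RPC2"
       registerProcedure="pleaseNotify" protocol="xml-rpc" />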
The following Publish and Subscribe (PubSub) system is for web sites that display various feeds, rendered into XHTML and inserted into the page with a server-side include. It is currently running live on the test site: http://www.linkpimp.com.
This system also incorporates two other things we have already discussed. First, it renders the feed into XHTML (we can leave the on-screen formatting to a style sheet), but the subroutine that does this is easy to pick out in the code and can be turned to something else entirely: sending IM notifications, for example.
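To give the flavor of it, here is the sort of rendering subroutine you could swap in, using the standard XML::RSS module. This is a minimal sketch, not the book's final code: the XHTML it produces is deliberately plain, and the loop body could just as easily fire off IM notifications instead.

use XML::RSS;

# Render a feed, passed in as raw XML, into a simple XHTML fragment.
# The markup is illustrative; leave its appearance to a style sheet.
sub render_feed {
    my ($xml) = @_;
    my $rss = XML::RSS->new();
    $rss->parse($xml);

    my $html = qq{<div class="feed">\n<h2>$rss->{'channel'}{'title'}</h2>\n<ul>\n};
    foreach my $item ( @{ $rss->{'items'} } ) {
        $html .= qq{<li><a href="$item->{'link'}">$item->{'title'}</a></li>\n};
    }
    $html .= qq{</ul>\n</div>\n};
    return $html;
}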
Second, it uses Syndic8's API and subscription-list service to tell it which feeds to examine. This lets us not only play with the Syndic8 system, but also add or remove feeds from the site without touching its code: I can simply log on to Syndic8 and subscribe to a feed there, and it should be incorporated into the site within the hour. It is rough and ready, but you will undoubtedly get ideas of your own from it.
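Fetching the subscription list is a single XML-RPC call. The following sketch, using XMLRPC::Lite, shows the shape of the exchange; the method name and the list identifier are assumptions, so check them against Syndic8's API documentation before use:

use XMLRPC::Lite;

# Syndic8's XML-RPC endpoint. The method name and its argument are
# assumptions for this sketch -- consult the Syndic8 documentation for
# the exact signature of the subscription-list call.
my $syndic8 = XMLRPC::Lite->proxy('http://www.syndic8.com/xmlrpc.php');
my $feeds   = $syndic8->call( 'syndic8.GetSubscribed', 'your-list-id' )->result;

foreach my $feed ( @{$feeds} ) {
    print $feed->{'dataurl'}, "\n";    # 'dataurl' is Syndic8's field for the feed URL
}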
The system comes in two parts:
The first script retrieves a subscription list from Syndic8, using that site's XML-RPC interface, checks each feed for a cloud element, and subscribes where necessary (the subscription call is sketched after this list). It then renders the feeds into XHTML, where necessary, saves them to disk, and creates a new file, feeds.shtml, containing the server-side include instructions that pull the feeds in. This script is run hourly. Example 12-1, later in this chapter, contains the complete listing.
The second script listens for Publish and Subscribe notifications and refreshes the rendered feed when necessary (a sketch of such a listener also follows). It is run as a daemon. Example 12-2, later in this chapter, contains the complete listing.
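Both halves of the system boil down to one XML-RPC interaction each. On the subscribing side, once the hourly script finds a cloud element, it calls the cloud's registerProcedure. A minimal sketch, in which the callback name, port, and path are assumptions that must match whatever your listener answers on:

use XMLRPC::Lite;

# Subscribe to a feed's cloud. $cloud holds the attributes of the feed's
# cloud element. The parameter order follows the RSS 2.0 cloud interface:
# procedure to call back, our port, our path, our protocol, feed URLs.
sub subscribe_to_cloud {
    my ( $cloud, $feed_url ) = @_;
    my $endpoint = "http://$cloud->{'domain'}:$cloud->{'port'}$cloud->{'path'}";

    XMLRPC::Lite->proxy($endpoint)->call(
        $cloud->{'registerProcedure'},    # usually 'pleaseNotify'
        'linkpimp.getNotified',           # callback procedure (assumed name)
        8888,                             # port our listener runs on (assumed)
        '/RPC2',                          # path our listener answers on (assumed)
        'xml-rpc',
        [$feed_url],
    );
}

On the listening side, the daemon can be as small as a Frontier::Daemon serving that one method. Again a sketch, with the port and method name assumed to match the subscription call above:

use Frontier::Daemon;

# Called by the remote cloud when a subscribed feed changes; $url is
# the feed that needs re-fetching and re-rendering.
sub get_notified {
    my ($url) = @_;
    warn "Feed changed: $url\n";
    # ...re-fetch $url and re-render it to disk here...
    return 1;
}

# Serve XML-RPC requests forever; new() returns only on failure.
Frontier::Daemon->new(
    LocalPort => 8888,
    methods   => { 'linkpimp.getNotified' => \&get_notified },
) or die "Couldn't start daemon: $!";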
Running the system creates three files of its own:
The first is a verbose log, kept to explain what is happening as the system runs. You can disable it if you wish.
The second holds the URL and last-subscription time of every feed to which you have a Publish and Subscribe subscription. It must not be deleted; LinkpimpStrimmer.pl keeps it healthy by stripping out URLs older than 24 hours (a sketch of the strimmer follows this list).
The third, feeds.shtml, contains the server-side include instructions for displaying the feeds on your page. It should, in turn, be pulled into your customer-facing page with its own SSI directive:
<!--#include file="feeds.shtml" -->
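The generated feeds.shtml is itself nothing more than a stack of further include directives, one per rendered feed, along these lines (the filenames are invented for illustration; the real ones are whatever the hourly script saves each rendered feed under):

<!--#include file="feed_0.html" -->
<!--#include file="feed_1.html" -->
<!--#include file="feed_2.html" -->

As for the subscriptions file, the promised sketch of LinkpimpStrimmer.pl follows. It assumes a file named pubsub.txt in which each line holds a feed URL and its subscription time in epoch seconds, separated by a space; the real filename and format may differ:

#!/usr/bin/perl
use strict;
use warnings;

# Drop entries older than 24 hours from the subscriptions file.
# The filename and the line format (URL, space, epoch timestamp) are
# assumptions for this sketch.
my $file   = 'pubsub.txt';
my $cutoff = time() - 24 * 60 * 60;

open my $in, '<', $file or die "Can't read $file: $!";
my @fresh = grep {
    my ( undef, $stamp ) = split ' ', $_;
    defined $stamp && $stamp > $cutoff;
} <$in>;
close $in;

open my $out, '>', $file or die "Can't write $file: $!";
print {$out} @fresh;
close $out;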