The setting

By conviction, and also because it's just common sense IMHO, I refrain from storing too much content (text, photos, code) directly on sites like Facebook ("corporate silos", as some would say) that tend to consider their users' data as their own a bit too easily.
More precisely, the best place to store one's data remains one's own computer (with a tad of backup, that is), and that's the main way I store my photos and code.
For data a bit more "endemic" to the Internet (blog posts and a selection of photos to be shown), I use "free" services built on free software and open protocols, which makes it easier to connect several of them together. In this respect, Wordpress, zenPhoto and identi.ca are doing a fairly good job for me.
The aim

Well, the aim is quite simple at first sight: make it so that the new posts and photos published on my blog or on identi.ca end up displayed on my Facebook wall.
It should be "simple", since all the required technology exists, is free, and is utterly widespread: on one hand the Internet and its HTTP protocol, and on the other hand the RSS feed, ubiquitous over the web and allowing tons of applications to "keep in touch" with the new content appearing on so many websites (a new post, a new photo, etc.).
The failure

Some time ago, Facebook offered the possibility to plug in several RSS feeds whose content would appear on one's wall. It worked OK, well enough to make a blog post about it pointless anyway. Since then, they limited the number of feeds that can be plugged in to just one, which means that one cannot publish content from several sources and has to choose which one gets onto Facebook (either photos, blog posts, or dents/tweets). The most annoying thing is that even with only one feed, the thing just didn't seem to work properly.
So the message was clear: if you want to do such a thing, use a third-party application. I actually tried a renowned one supposed to do exactly what I needed (twitterfeed), but despite having several interesting options to filter the feeds' content, it appeared to be nearly as bad at reading feeds as Facebook.
The solution

I eventually found another application that did exactly what I asked of it (at least so far): RSS Graffiti. This application transcribes the content of several feeds onto a Facebook wall. Functionality-wise that's about all there is, but it's exactly what I need and, above all: it works!
Going a little bit further...

For a little more refinement, and to play a little with the feeds' content (modifying titles and filtering out some articles), it's possible to use Yahoo Pipes: a very impressive tool which is also quite fun to use.
For those who intend to try, be aware that there is a drawback to using Yahoo Pipes with an application like RSS Graffiti (the same would happen with twitterfeed and Facebook): Yahoo limits the number of connections coming from a given IP, so as soon as too many people plug RSS Graffiti into their Yahoo Pipes, Yahoo blocks (some of) the connections from RSS Graffiti. Result: RSS Graffiti is unusable when plugged directly into Yahoo Pipes...
Happily, a solution exists, but then it becomes a little more hackish, even if there's nothing there to impress anybody who has already played with a Linux server, for instance. What's required is a server serving web pages, on which you can set up a cron job running wget to fetch the feed from Yahoo Pipes and save it in a directory accessible from the web. It's the URL of the copied RSS file that must be given to RSS Graffiti for the whole thing to work. Also note that the copied RSS file must be deleted before wget goes looking for another copy (( so the cron job is more like 'rm foo.rss && wget yahoo_pipe_url -O foo.rss' )).
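The setup above can be sketched as a crontab entry; everything here is a placeholder to adapt to your own server (I'm assuming the web server serves files from /var/www/html, and YAHOO_PIPE_URL stands for the actual pipe URL):

```shell
# Hypothetical crontab line: every 15 minutes, drop the old copy and
# fetch a fresh one from the pipe into a web-accessible directory.
# rm -f so the very first run (no file yet) doesn't abort the chain;
# wget -q keeps cron mail quiet.
*/15 * * * * rm -f /var/www/html/feeds/foo.rss && wget -q 'YAHOO_PIPE_URL' -O /var/www/html/feeds/foo.rss
```

RSS Graffiti is then pointed at the public URL of that copy, e.g. http://your.server/feeds/foo.rss.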
Update: apparently the RSS file generated by Yahoo Pipes does not declare its encoding in the XML header (the first line reads <?xml version="1.0"?> ). But when several feeds, mostly in UTF-8, are joined into a single pipe and some non-UTF-8 character somehow appears among all that content, this can cause interpretation problems on some services (accented letters displayed as "weird" characters, for instance). A way to avoid these problems is to make sure that the UTF-8 encoding is explicitly declared, which can be done with a little play of echo and tail, so that the command becomes:
rm machin_no_encoding.rss \
  && wget yahoo_pipe_url\&_render=rss -O machin_no_encoding.rss \
  && echo "<?xml version=\"1.0\" encoding=\"UTF-8\"?>" > machin.rss \
  && tail -n +2 machin_no_encoding.rss >> machin.rss
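To check the echo/tail trick in isolation, with no network involved, here is a small self-contained sketch: it fakes a minimal feed file whose header lacks an encoding declaration (as Yahoo Pipes would produce it), then applies the same rewrite; file names and feed content are made up for the demo:

```shell
#!/bin/sh
set -e

# Fake a minimal feed whose XML header has no encoding declaration.
cat > machin_no_encoding.rss <<'EOF'
<?xml version="1.0"?>
<rss version="2.0"><channel><title>Démo</title></channel></rss>
EOF

# Write a header that declares UTF-8 explicitly, then append everything
# after the original first line.
echo '<?xml version="1.0" encoding="UTF-8"?>' > machin.rss
tail -n +2 machin_no_encoding.rss >> machin.rss

# The first line now carries the explicit encoding.
head -n 1 machin.rss
# prints: <?xml version="1.0" encoding="UTF-8"?>
```

The rest of the file is copied untouched, so the feed items themselves are not altered, only the header line changes.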