I'm working on a project where we want partner sites to be able to
pull our content onto their own web sites.
A key constraint is that the partners won't have much technical
sophistication (probably no db or php experience).
Also, since these are third-party sites, their environments will vary
(different xsl processors, allow_url_fopen disabled, etc...).
I looked at sites that syndicate their content to other sites to see
how they do it. The possible solutions seem to be:
1. rss / xml - this is "well known" but sets a pretty high technical
hurdle for the other sites. We could write every walkthrough and code
sample imaginable and I still think it would be too involved. The only
ways out are an xsl transformation (not going to happen), a custom
parsing library in php (maybe -- see the first sketch after this
list), or writing to a db with associated libraries (too complex).
2. serialized php with a custom library - this seems more feasible
than #1 in terms of requirements (second sketch below).
3. php proxy and ajax call / json - i think this would be too complex
and would require too much apache fiddling to handle subdomains, etc.
(third sketch below)
4. included javascript - kinda how digg handles syndication (see
http://www.digg.com/add-digg for example).
This seems like the lowest barrier to entry (last sketch below). The
main concern is preventing unauthorized bots from crawling the
content. Our only real option there seems to be checking the referer
header, which can be forged relatively easily.
5. SOAP - no way
6. REST - if that just means plain xml over http, I see it as an
extension of #1.
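
To make the comparison concrete, here are rough sketches of what each
option would involve. These are untested, the URLs and field names are
made up, and I'm assuming php 5 with the curl extension on the partner
side (curl so allow_url_fopen can stay off).

For #1, consuming an rss feed with SimpleXML:

<?php
// hypothetical feed url we'd hand to the partner
$feedUrl = 'http://example.com/partner-feed.xml';

// fetch with curl instead of file_get_contents() so allow_url_fopen
// can stay disabled on the partner's server
$ch = curl_init($feedUrl);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
$xml = curl_exec($ch);
curl_close($ch);

if ($xml !== false && ($feed = simplexml_load_string($xml)) !== false) {
    // walk the rss 2.0 structure: channel -> item
    foreach ($feed->channel->item as $item) {
        printf('<p><a href="%s">%s</a></p>',
               htmlspecialchars($item->link),
               htmlspecialchars($item->title));
    }
}
?>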
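
For #2, the serialized php version -- the partner never touches xml,
they just unserialize() straight into an array, which is why the
requirements feel lighter. Same made-up endpoint:

<?php
// fetch the serialize()'d article list from our (hypothetical) endpoint
$ch = curl_init('http://example.com/partner-content.php?format=phps');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$raw = curl_exec($ch);
curl_close($ch);

// unserialize() hands back a plain php array -- no parsing step at all.
// caveat: unserialize() will happily rebuild objects too, so the feed
// should only ever contain scalars and arrays.
$articles = unserialize($raw);
if (is_array($articles)) {
    foreach ($articles as $article) {
        echo '<h3>' . htmlspecialchars($article['title']) . '</h3>';
        echo '<p>' . htmlspecialchars($article['teaser']) . '</p>';
    }
}
?>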
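
For #3, just so it's clear what I mean by the php proxy: the partner
would host a one-page script on their own domain so their ajax call
stays same-origin, and it simply forwards to our (again, made-up) json
endpoint:

<?php
// proxy.php, hosted on the partner's domain
$ch = curl_init('http://example.com/partner-content.php?format=json');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$json = curl_exec($ch);
curl_close($ch);

header('Content-Type: application/json');
echo ($json !== false) ? $json : '{"error":"upstream unavailable"}';
?>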
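
For #4, the included javascript route: the partner pastes a single
<script src="http://example.com/widget.js.php?partner=acme"></script>
tag into their page, and widget.js.php on our side emits
document.write() calls. The referer check below is the gate I was
talking about (and the part that's trivially forgeable);
get_latest_items() is a stand-in for however we'd pull the content out
of our own db, and json_encode() needs php 5.2+:

<?php
// hypothetical partner registry: partner id => allowed referring host
$allowed = array('acme' => 'www.partner-example.com');

$partner = isset($_GET['partner']) ? $_GET['partner'] : '';
$referer = isset($_SERVER['HTTP_REFERER'])
         ? parse_url($_SERVER['HTTP_REFERER'], PHP_URL_HOST) : '';

header('Content-Type: application/x-javascript');

// refuse anything that isn't a known partner hitting us from their site
if (!isset($allowed[$partner]) || $referer !== $allowed[$partner]) {
    exit('/* not authorized */');
}

// get_latest_items() is made up -- it would read from our own db
foreach (get_latest_items() as $item) {
    $html = '<p><a href="' . htmlspecialchars($item['url']) . '">'
          . htmlspecialchars($item['title']) . '</a></p>';
    // json_encode() gives us a safely quoted javascript string literal
    echo 'document.write(' . json_encode($html) . ");\n";
}
?>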
any thoughts or opinions would be appreciated.
-jt