Johan writes of JSON feeds, which sounds like the nifty idea of making your data available as a chunk of JavaScript that can be easily used by anyone. Apparently some services already do this. The beauty of this is that, since web developers are already writing in JavaScript, they can just include your chunk of JavaScript in their own web applications and do something cool with it. No parsing of XML or other encoding formats required.

I may be missing some key ideas here, but a couple of points spring to mind:

  • What about namespacing issues? Also, the “innovator” seems very susceptible to name changes (or the service provider needs to be “locked in” to the naming choices they have made).
  • How do you ensure that the JSON feeds don’t sneakily embed some functional JavaScript (as opposed to purely data)? You might be able to trust a data source one day, but forever?
  • How do you include JSON feeds dynamically into a web page? Are you forced to use DHTML to construct a <script> tag that references the external JSON feed?
  • What about XML? Most people like to (or feel obliged to?) encode their data in XML so that they can leverage the existing XML support that is ubiquitous nowadays. Pointless arguments for or against XML aside, this means that if the service provider wants to provide their data in both formats, they have to do extra work. Furthermore, there is no guaranteed payoff for providing it in JSON (Johan mentions user innovation, but this is definitely not a given benefit). Thus, JSON feed support will lag.
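On the third point, here is a minimal sketch of what the dynamic include could look like, assuming a JSONP-style feed that wraps its data in a caller-named callback function. The feed URL and the "callback" query parameter below are made up for illustration; real providers will vary.

```javascript
// Sketch: include a JSON feed at runtime by constructing a <script> tag.
// Assumes the provider wraps the data in a callback whose name is passed
// as a query parameter -- both the URL and parameter name are hypothetical.
function buildJsonpUrl(base, callbackName) {
  var sep = base.indexOf("?") === -1 ? "?" : "&";
  return base + sep + "callback=" + encodeURIComponent(callbackName);
}

function includeJsonFeed(url, callbackName) {
  // Browser-only part: dynamically create and append the script tag.
  var script = document.createElement("script");
  script.type = "text/javascript";
  script.src = buildJsonpUrl(url, callbackName);
  document.getElementsByTagName("head")[0].appendChild(script);
}

// Usage (in a page): define a global handler, then pull the feed in.
// function handleFeed(data) { /* do something cool with it */ }
// includeJsonFeed("http://example.com/feed", "handleFeed");
```

Once the script loads, the feed calls your handler with the data, so no DHTML beyond creating the tag is needed.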

    I think to spur adoption of JSON feeds, it would be great if someone wrote some generic code that would convert an arbitrary XML document/node into the equivalent JSON fragment. This could be done in PHP, XSLT or JavaScript. Any and all would be useful. Time to invoke the Lazy Web?
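As a proof of concept, here is a naive JavaScript sketch of such a converter. It handles only a tiny, well-behaved subset of XML (nested elements with text content; no attributes, namespaces, CDATA, or mixed content), and the function name is my own invention:

```javascript
// Naive XML-to-JSON sketch: turns nested elements into nested objects
// and leaves leaf text content as strings. A real implementation would
// walk a parsed DOM tree instead of using regular expressions.
function xmlToJson(xml) {
  var tagRe = /<(\w+)>([\s\S]*?)<\/\1>/g;
  var obj = {};
  var match, found = false;
  while ((match = tagRe.exec(xml)) !== null) {
    found = true;
    obj[match[1]] = xmlToJson(match[2]);
  }
  // Leaf node: no child elements, so return the text itself.
  return found ? obj : xml;
}
```

For example, xmlToJson("<item><title>Hi</title></item>") yields an object whose item.title is "Hi". Attributes and namespaces are exactly where a generic converter gets hard, which is why this is only a sketch.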

    For instance, given a web service already has a PHP file that provides a bunch of XML data (maybe extracted from a database), which is being used by their application, the web service provider can include the above mentioned XML-to-JSON PHP code and provide a switch in their PHP file that will provide the data as JSON instead. Not a lot of extra work from the data provider’s side since the code was already written for them, but they now provide the JSON feed for other users to possibly pick up and innovate.

    Or given that there already exists some data exposed as XML on another site and I want to use it in my own web application, I can copy the above-mentioned XML-to-JSON JavaScript code to my site, run the foreign XML through this function and then go forward. This saves me the trouble of having to write a parser to handle the XML.

All in all, it seems like a cool idea that I need to investigate further.

§219 · January 31, 2006 · JavaScript, Software, Technology, Web

Comments on “JSON feeds”

  1. Rob says:

    I read Douglas Crockford’s thing on JSON a while back and I didn’t think too much of it. It’s a handy notation when everyone’s transforming their XML into JS objects anyways, but I don’t think it belongs at that layer. It’s too much like arbitrarily executing anything you’re handed.

  2. Johan says:

    You probably noticed my in-depth reply article about these things, but here is a more direct digest I wrote up first, prior to making it an entire article:

    1: Naming issues are the same for JSON as for any XML dialect; the day your favourite data provider breaks backwards compatibility with the RSS format they previously committed to, your application breaks. It can be argued that widely adopted schemas such as RSS and Atom are safer bets, but IMO it’s a moot point. As soon as you use somebody else’s data, you are at their mercy as to whether they keep making it available in the format they picked.

    2 & 3: Overcoming the same-origin-policy browser security model is the main point and strength of JSONP (vs XML): yes, you point your script tag at the enemy and are at their mercy not to hand you side effects you did not bargain for. If you don’t trust your feed not to send code with side effects, you either need elevated privileges so you can fetch the content via XMLHttpRequest, or you need to use a JSON parser rather than eval. The JSON site has one in JavaScript, but I’d opt to use some server-side script to process and cleanse the feed first if I were in this situation. In that case, picking up just any feed at all and reformatting it as JSONP would be the solution closest at hand.
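A rough sketch of that parse-rather-than-blindly-eval idea, loosely in the spirit of the validating parser on json.org (the whitelist below is simplified and my own, not the real json.org code):

```javascript
// Sketch: refuse to evaluate anything that is not plain JSON data.
// Strip out string literals, keywords, and numbers; whatever remains
// must be pure structural punctuation, or we reject the feed.
function parseJson(text) {
  var cleaned = text
    .replace(/"(\\.|[^"\\])*"/g, "")                // string literals
    .replace(/\btrue\b|\bfalse\b|\bnull\b/g, "")    // keywords
    .replace(/-?\d+(\.\d+)?([eE][+-]?\d+)?/g, "");  // numbers
  if (/[^\s,:{}\[\]]/.test(cleaned)) {
    throw new Error("not plain JSON data");
  }
  return eval("(" + text + ")");
}
```

This rejects things like alert(1) while still accepting ordinary object and array literals; server-side cleansing, as suggested above, is still the safer route.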

    4: A generic XML-to-JSONP proxy is a good idea in theory. Chances are it might be as useful to white-hat as to black-hat hackers, though I haven’t given it much thought. The “open proxy” idea might not be much of a problem since it’s just about reading data, though it might swamp the script and site with traffic. 🙂

    The way I see it, making JSONP feeds for external consumption is really just interesting when you invite the wide public to innovate around your data. If you don’t, why bother much about it at all?

  3. Jeff Schiller says:

    Thanks, Johan – I didn’t pick up your second post about this until now. Good stuff. Btw, I wasn’t even considering a centralized XML-to-JSONP proxy, more of a here-is-some-code-free-to-use-and-copy type thing. Parsing XML is not hard in any language (just tedious) but getting the conversion proper might prove to be a little challenging … and I’m lazy 😉

  4. Geir Aalberg says:

    > it would be great if someone wrote some generic code that would convert an arbitrary
    > XML document/node into the equivalent JSON fragment

    “Arbitrary” is a pretty daunting task (given that XML is a pretty complex data structure, especially when using namespaces), but XSLT should be as good a tool as any. Plus, you can even do that directly in the browser.

  5. shr3kst3r says:

    I wrote up a quick article on converting RSS feeds to JSON using PHP. You might find it interesting. It is quick and dirty, but it might be useful for a mash-up.