I recently started looking into building web applications using .NET MVC and I stumbled upon this blog post by Phil Haack: JSON Hijacking. For those of you who aren’t aware of this vulnerability when using JSON to transfer sensitive data, it’s really a must-read.
It seems that there are three ways to handle this vulnerability.
- Require a POST instead of GET in your JSON service.
- Wrap your JSON array responses in a JSON object.
- Don’t expose sensitive data in any service that isn’t protected by one of the first two measures.
The third alternative isn’t really an option, since it severely limits the usefulness of JSON.
So which one of the other two do you prefer?
The .NET MVC 2 preview requires a POST for JSON responses by default. I think this is a great way to protect any developer who doesn’t know about this problem yet, but to me it feels a little "hacky" to break REST in this way. Unless someone talks me out of it, I’m sticking to wrapping my arrays in another object and unwrapping them client side.
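A minimal sketch of the object-wrapping approach on the client side (the wrapper property name `d` and the helper name are assumptions for illustration; `d` echoes the convention ASP.NET AJAX uses, but any property name works):

```javascript
// The server returns {"d": [...]} instead of a bare array.
// A bare top-level array is a valid target for a cross-site <script src> tag,
// while an object literal at statement position is not, so wrapping the array
// in an object defeats the hijack.
function unwrapJson(responseText) {
  return JSON.parse(responseText).d; // "d" is the assumed wrapper property
}

var accounts = unwrapJson('{"d": [{"id": 1}, {"id": 2}]}');
// accounts is the original array: [{"id": 1}, {"id": 2}]
```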
I personally wrap all my responses in a comment and strip that off before JSON.parse-ing. This makes the response useless as a target for script tags.
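As a sketch, the comment-wrapping idea looks something like this (the exact wrapper format and helper name here are illustrative assumptions, not from the post):

```javascript
// The raw HTTP response body is the JSON wrapped in a block comment, e.g.:
//   /* {"accounts": [1, 2, 3]} */
// Evaluated as a script, the whole response is just a comment, so a
// cross-site <script> tag learns nothing; the legitimate client strips
// the wrapper before parsing.
function parseCommentWrapped(responseText) {
  var json = responseText.replace(/^\s*\/\*/, "").replace(/\*\/\s*$/, "");
  return JSON.parse(json);
}

var data = parseCommentWrapped('/* {"accounts": [1, 2, 3]} */');
// data.accounts is [1, 2, 3]
```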
It’s worth noting that this problem is not only to do with JSON, but any HTTP response you serve that could be interpreted as JavaScript. Even, say, a .htaccess-protected text file is vulnerable to leaking through third-party script tag inclusion, if it’s in a format that happens to be valid JavaScript.
And here’s the crunch: thanks to E4X, even normal, static XML documents are also valid JavaScript. E4X is a disastrous and useless extension to JavaScript, invented and implemented at Mozilla, which allows you to write XML literals like `<element>content</element>` inline in JS; as such, a protected XML file is now vulnerable to the same cross-site-leakage risks as JSON. Thank you Mozilla. See Google Doctype’s article on this.
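For illustration, here is roughly what that meant in practice (historical; E4X support was later removed from Firefox, so this no longer runs in current browsers):

```
// Under E4X, an XML literal was a valid JavaScript expression:
var doc = <element>content</element>;
// So a third-party page could include
//   <script src="https://victim.example/private.xml"></script>
// and have the browser evaluate the protected XML document as a script,
// opening the same leakage channel that bare JSON arrays suffer from.
```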