Dear Lazyweb, sanity check this for me?
Many web pages load resources from other sites: images, scripts, and so on. This is useful, and fairly harmless. However, all popular browsers send any cookies and HTTP auth credentials along with those cross-site requests. They allow cross-site GET with a query string, and cross-site POST, both carrying cookies and HTTP auth. Does that sound right to you?
almost every single form on the internet is vulnerable to it …
nearly every site is vulnerable to it in multiple places
Attacking pages get fully logged-in, write-only access to any vulnerable web app. Read access is sometimes possible too, for example where JSONP is used or an XSS vulnerability can be exploited.
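To make that concrete, here is a sketch of the sort of page an attacker serves: it auto-submits a hidden form to the victim app, and the browser sends the victim's cookies along with the POST. The target URL and field names below are invented for illustration.

```python
# Sketch of a CSRF attack page: any page the victim visits can auto-submit
# a hidden form to another site, and the browser attaches the victim's
# cookies for that site. URL and field names are made up.

def build_attack_page(action_url, fields):
    """Return HTML that POSTs `fields` to `action_url` as soon as it loads."""
    inputs = "".join(
        f'<input type="hidden" name="{name}" value="{value}">'
        for name, value in fields.items()
    )
    return (
        '<html><body onload="document.forms[0].submit()">'
        f'<form action="{action_url}" method="POST">{inputs}</form>'
        "</body></html>"
    )

page = build_attack_page(
    "https://bank.example/transfer",          # hypothetical target app
    {"to": "attacker", "amount": "1000"},
)
```

The victim never sees the form; from the target app's point of view it is an ordinary logged-in POST.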
Major sites such as Gmail, and banks, have been found vulnerable to CSRF. All web apps are vulnerable unless specifically designed to avoid it. Many home routers are vulnerable too, which can lead to DNS and SSL pwnage just from visiting a hostile web page. Are there any sensitive applications on your work Intranet?
Cross-site cookies are useful for a few applications, such as Google ads and analytics, and Facebook’s ‘0 friends like this page’. For other web apps, they are a hazard.
The problem lies in HTTP and browser behaviour, and ideally that’s where we should fix it. It might be more practical to protect the web apps instead, but there are a lot of different web apps.
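For the protect-the-apps route, the usual defence is a secret per-session token embedded in every form and checked on submit; an attacking page can't read it cross-site, so it can't supply it. A minimal sketch, with key handling and function names of my own invention:

```python
import hashlib
import hmac
import secrets

# Server-side secret; a real deployment would persist this across restarts.
SECRET = secrets.token_bytes(32)

def csrf_token(session_id: str) -> str:
    """Derive a per-session token to embed as a hidden field in every form."""
    return hmac.new(SECRET, session_id.encode(), hashlib.sha256).hexdigest()

def is_valid(session_id: str, submitted: str) -> bool:
    """Check the token on every state-changing request; reject mismatches."""
    return hmac.compare_digest(csrf_token(session_id), submitted)
```

Deriving the token with an HMAC means the server need not store a token per session, only the one secret.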
Disclaimer for the following: I’m not an expert in CSRF or web security.
All cookies must be considered ‘local’ by default. Legitimate cross-site cookies must be declared ‘remote’ when set. Local cookies must not be sent with any cross-site request, that is, any request where the referring or embedding page is on a different domain.
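Browsers did later grow something close to this in the cookie SameSite attribute: Strict (or Lax) behaves like ‘local’, and SameSite=None like a declared-‘remote’ cookie. A sketch of setting a ‘local’ cookie with Python’s standard http.cookies module (the mapping to the local/remote terms above is mine):

```python
from http.cookies import SimpleCookie

# A 'local' cookie in the above terms: SameSite=Strict means the browser
# never sends it on cross-site requests. (SameSite support needs Python 3.8+.)
cookie = SimpleCookie()
cookie["session"] = "abc123"
cookie["session"]["samesite"] = "Strict"
cookie["session"]["httponly"] = True   # also hide it from page scripts
cookie["session"]["secure"] = True     # and send it over HTTPS only

header = cookie["session"].OutputString()  # value for a Set-Cookie header
```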
Http-Auth headers must not be sent in a cross-site request.
Cross-site POST performs an action, and can be harmful even when the user is not authenticated. Most web frameworks treat GET? (a GET with a query string) just like POST, so most web apps will also perform actions in response to a GET? request.
The browser could block cross-site POST and strip the query string from cross-site GET?, or ask the user whether actions should be allowed between those two domains. That would be annoying, so web designers would stop using query strings to reach static resources.
Apache’s mod_rewrite, or similar, can expose resources that live behind a query string at a plain URL instead. This is often done for SEO anyway.
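For instance, a hypothetical mod_rewrite rule exposing a query-string resource at a clean path:

```apache
# Serve /articles/123 from the script that actually wants ?article_id=123.
RewriteEngine On
RewriteRule ^articles/([0-9]+)$ /index.php?article_id=$1 [L,QSA]
```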
The ‘Referer’ header may be disabled for privacy. A new header, ‘Origin’, could be ‘local’ or ‘remote’, with possibly some other values. Any web app could then see whether the referring page was on the same domain, without seeing the user’s browsing history.
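The Origin header that browsers did eventually ship carries the full scheme and host rather than a local/remote flag, but it supports the same check. A server-side sketch, where the allowed origin and function names are hypothetical:

```python
from urllib.parse import urlparse

# This app's own origin -- hypothetical.
ALLOWED_ORIGIN = "https://app.example"

def same_origin(headers: dict) -> bool:
    """Allow a state-changing request only if it came from our own pages.

    Uses Origin when present, falls back to Referer; refuses when both
    are missing (the conservative default).
    """
    source = headers.get("Origin") or headers.get("Referer")
    if not source:
        return False
    parsed = urlparse(source)
    return f"{parsed.scheme}://{parsed.netloc}" == ALLOWED_ORIGIN
```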
I’m not sure whether a new HTTP version, or a completely new protocol, would be needed to stop old browsers from fouling things up! It’s easier to change protocols, browsers and servers than to fix every web app on the planet. We should make sure to fix several other browser faults, such as XSHM, before switching. Ahahah, I’ve lost my monkeys!!
So, thanks if you read some of this. How crazy am I on a scale of 1 to 10?
I can’t explain why CSRF has not been more widely exploited, but let me try anyway. Crackers and script kiddies have such a wide smorgasbord of fun exploits to choose from, ones where they can take over servers and build botnets. This simple web stuff isn’t as much fun.