OAuth 2.0 drops signatures and cryptography in favor of bearer tokens, similar to how cookies work. As the OAuth 2.0 editor, I’m associated with the OAuth 2.0 protocol more than most, and the assumption is that I agree with the decisions and directions the protocol is taking. While being an editor gives you a certain degree of temporary control over the specification, in the end decisions are made by the group as a whole (as they should be).
And as a whole, the OAuth community has made a big mistake about the future direction of the protocol. A mistake that is going to make OAuth 2.0 a much less significant agent of change on the web.
All this Crypto Business
OAuth 1.0 allows application developers to sign requests. Signatures remove the need to send plain-text secrets over insecure (or secure) channels. Instead of sending the secret with the request for the other side to compare against their copy (similar to how passwords work), the secret is used to calculate a value which cannot be converted back into the secret itself. The signature can then only be verified by someone holding a copy of the secret.
By performing an irreversible calculation that the other side can verify, signatures protect secrets by simply never sending them over the wire. Since no secrets are transmitted, applications don’t need to rely on other protocols (such as SSL/TLS) to protect them in transit. That’s not all signatures provide, but more on that later.
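The mechanics can be sketched in a few lines. This is an illustrative sketch, not the full OAuth 1.0 algorithm — the hostname is hypothetical and the signed string is simplified — but HMAC-SHA1 is the construction OAuth 1.0 actually specifies:

```python
import hashlib
import hmac

def sign(secret: bytes, message: bytes) -> str:
    # Irreversible: the signature cannot be converted back into the secret.
    return hmac.new(secret, message, hashlib.sha1).hexdigest()

secret = b"shared-secret"  # held by both sides, never sent on the wire

# The client computes a signature over the request and sends only that.
request = b"GET&https%3A%2F%2Fapi.example.com%2Fphotos"
signature = sign(secret, request)

# The server recomputes with its own copy of the secret and compares.
valid = hmac.compare_digest(signature, sign(secret, request))
```

Only `request` and `signature` travel over the network; an eavesdropper who captures both still cannot recover `secret`.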
To sign a request, developers have to follow a list of steps in a very specific order and with much care (which often feels like battling a dragon). The smallest mistake causes the entire request to fail. While the OAuth 1.0 signature process could have been somewhat simpler (no double encoding, different sorting, no URI parsing into query parameters, etc.), any time developers need to canonicalize data, stuff breaks. Even beyond the complex math, cryptography is hard because it is generally unforgiving. It does not tolerate mistakes.
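For a taste of the canonicalization involved, here is a deliberately simplified base-string construction in the spirit of OAuth 1.0. Real base strings also fold in the OAuth protocol parameters, body parameters, and port normalization — this is a sketch of the fragile parts, not a compliant implementation:

```python
from urllib.parse import quote

def base_string(method: str, url: str, params: dict) -> str:
    enc = lambda s: quote(str(s), safe="")  # strict percent-encoding
    # Parameters must be encoded first, then sorted byte-wise.
    pairs = sorted((enc(k), enc(v)) for k, v in params.items())
    normalized = "&".join(f"{k}={v}" for k, v in pairs)
    # The normalized string gets encoded again when joined below: the
    # "double encoding" that trips up hand-rolled implementations.
    return "&".join([method.upper(), enc(url), enc(normalized)])
```

Get the encoding set, the sort order, or the double encoding slightly wrong and the signatures silently stop matching — with no hint as to why.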
WRAP and the Stupidity Threshold
After deploying OAuth 1.0, many companies discovered the cost of supporting it due to mismatched signatures. OAuth 1.0 looks simple enough for developers to code from scratch instead of using a library (as opposed to SSL or TLS, which no one in their right mind would try to write from scratch). When developers write their own code, they are likely to get some small detail wrong. It didn’t help that the specification was vague and implicit about many important details (which has since been corrected in the RFC).
Ironically, the bigger the company, the more resources it had, and the less interesting and useful its API, the louder it complained about OAuth signatures. This had an easy and straightforward solution: provide better libraries to your developers, as well as better (or any) debugging tools. Alternatively, make your API so valuable that developers will be motivated to struggle through and figure it out. Unfortunately, this was not the solution the people behind WRAP had in mind.
At the heart of the WRAP architecture is the requirement to remove any cryptography from the client side. The WRAP authors observed how developers struggled with OAuth 1.0 signatures, and their conclusion was that the solution is to drop signatures completely. Instead, they decided to rely on a proven and widely available technology: HTTPS (or more accurately, SSL/TLS). Why bother with signatures if the developer can instead add a single character to their request (turning it from http:// to https://) and protect the secret from an eavesdropper?
Much of the criticism that followed focused on the fact that WRAP does not actually require HTTPS. It simply makes it an option. This use of tokens without a secret or other verification mechanism is called a bearer token. Whoever holds the token gains access. If you are an attacker, you just need to get hold of this simple string and you are good to go. No signatures, calculations, reverse engineering, or other such efforts required.
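That is the entire client-side burden of a bearer token. Both the token value and the endpoint below are hypothetical, but the shape of the request is the point:

```python
import urllib.request

# The whole credential is one opaque string in a header. Anyone who
# captures "SlAV32hkKG" can replay it verbatim, from anywhere.
req = urllib.request.Request(
    "https://api.example.com/photos",  # hypothetical endpoint
    headers={"Authorization": "Bearer SlAV32hkKG"},  # hypothetical token
)
```

There is nothing to compute and nothing bound to this particular request — which is exactly why it is both so easy to use and so easy to steal.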
As Secure As a Cookie
WRAP was based on a simple and powerful argument: bearer tokens are already a core web architecture. While far from ideal, the WRAP security model was directly based on cookies – the authentication layer behind almost every web application. Why bother creating something more secure if it makes the protocol harder for developers to use, while not actually improving the overall security of the service? As long as a site offers both an OAuth API and a human web interface (i.e. a web site), the overall service will only be as secure as its weakest part – the cookie-based authentication system.
The problem with this argument is not today, but five years from now. When someone tries to propose a new cookie protocol, developers will make the same argument, only this time pointing the finger at OAuth 2.0 as the weakest link. Removing signatures and relying solely on a secure channel solves the immediate problem and maintains the same existing level of security. But it lacks any kind of forward-looking responsibility, any drive to make the web more secure. It’s a copout.
What makes this more frustrating is that the people behind WRAP are some of the brightest security minds on the web. These guys know exactly what they are doing, and it’s not that they don’t care. They just gave up and decided that the best they can do is maintain the status quo. They also represent a large and powerful coalition of big companies too lazy to work a little harder at helping their developers use signatures successfully.
Doesn’t HTTPS Solve Everything?
HTTPS guarantees an end-to-end secure connection. The implementation and deployment details are critical, but when done correctly (which is not always the case), it is a great solution. What HTTPS provides is a secure channel. Any secret, password, or bearer token sent over HTTPS is protected and cannot be compromised by an attacker listening in on the line. HTTPS allows a client to send a secret to its desired destination securely.
However, HTTPS can’t help if the client’s desired destination is a bad place. HTTPS doesn’t help prevent phishing attacks because anyone can get an SSL certificate and show the secure icon in the browser. The fact you are using a secure channel doesn’t mean the entity on the other side is good. It just means that no one else can listen in on the conversation (no one, that is, except the bad guys on the other end). If a client sends its bearer token to the wrong place, even over HTTPS, it’s game over.
Another issue is that the OAuth working group could not even reach consensus on actually requiring HTTPS, leaving it as a recommendation for services to decide. Even OAuth 1.0 requires HTTPS for its plain-text flavor, which was added to get it published as an RFC. OAuth 2.0, by contrast, is satisfied with just a warning. OAuth 2.0’s solution is to allow (but not require) access tokens to be short-lived. By limiting the bearer token’s lifetime, stolen tokens are only useful for a short period of time, limiting the potential damage.
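The entire mitigation amounts to an expiry check on the server. A minimal sketch — the one-hour lifetime here is an arbitrary illustration, not a value from the specification:

```python
import time

TOKEN_LIFETIME = 3600  # seconds; illustrative, not mandated by OAuth 2.0

def token_still_valid(issued_at, now=None):
    # A stolen bearer token stays useful only until it expires.
    if now is None:
        now = time.time()
    return now - issued_at < TOKEN_LIFETIME
```

Note what this does and does not buy you: it shortens the window of abuse, but within that window a stolen token grants full access.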
Why None of this Matters Today
OAuth today is used together with proprietary web service APIs. There is little to no interoperability across these services (the Facebook API is used only on Facebook, etc.) and almost no clients perform discovery of any kind. Because the API endpoints are hard-coded into the client, when combined with HTTPS there is no risk of leaking the tokens. In this setup, the client does not need to do much thinking about where to send tokens and how to protect them.
Unlike cookies, which are sent to the server based on a somewhat complex set of client-side rules, OAuth clients today don’t use any rules. Instead they use a single token for an entire service, with all API endpoints preconfigured. There are no new subdomains to handle, or really any kind of unexpected or dynamic interaction. In this environment, bearer tokens over HTTPS are just fine.
Why All of this will Matter Soon
As soon as we try to introduce discovery or interoperable APIs across services, OAuth 2.0 fails. Because it lacks cryptographic protection of the tokens (there are no token secrets), the client has to figure out where it is safe to send them. OAuth 2.0’s reliance on the cookie model requires the same solution – making the client apply the security policy and figure out which servers to share its tokens with. The resource servers, of course, can ask for tokens issued by any authorization server.
For example, a protected resource can claim that it requires an OAuth access token issued by Google when in fact it has nothing to do with Google (even though it might be a Google subdomain). The client will have to figure out whether the server is authorized to see its Google access token. Cookies have rules regarding which cookie is shared with which server. But because these rules are enforced by the client, there is a long history of security failures due to incorrect sharing of cookies. The same applies to OAuth 2.0.
Any solution based on client side enforcement of a security policy is broken and will fail. OAuth 1.0 solves this by supporting signatures. If a client sends a request to the wrong server, nothing bad happens because the evil server has no way of using that misguided request to do anything else. If a client sends an OAuth 2.0 request to the wrong server (found via discovery), that server can now access the user’s resources freely as long as the token is valid.
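The asymmetry can be sketched directly. This is illustrative — both hostnames are hypothetical, and a real OAuth 1.0 signature also covers a timestamp and nonce to block replays — but it shows why a misdirected signed request is useless to its recipient:

```python
import hashlib
import hmac

def sign_request(token_secret: bytes, method: str, url: str) -> str:
    # The signature is bound to this exact request; the secret itself
    # never leaves the client.
    message = "&".join([method, url]).encode()
    return hmac.new(token_secret, message, hashlib.sha1).hexdigest()

token_secret = b"token-secret"

# A misdirected signed request leaks only a signature over that request.
leaked = sign_request(token_secret, "GET", "https://evil.example.com/")

# What the evil server would need to access the real API is a signature
# over a different request, which it cannot compute without the secret.
needed = sign_request(token_secret, "GET", "https://api.example.com/")
```

A leaked bearer token, by contrast, is `needed` and `leaked` rolled into one: the same string works everywhere.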
It is clear that once discovery is used, clients will be manipulated to send their tokens to the wrong place, just like people are phished. Any solution based solely on a policy enforced by the client is doomed.
No Discovery for You
Without signatures, OAuth 2.0 cannot safely support discovery. It is a waste of time and a risky business. Clearly, the OAuth community today does not care enough about discovery and interoperable services to do something about it. The cryptographic solutions proposed so far are focused on self-encoded tokens and other distributed systems, based on narrow use cases promoted by the likes of Google, Microsoft, and a few other enterprise-focused companies.
Without discovery, smaller companies will have a harder time getting their services accessible (e.g. when importing your address book from any provider, not just the big four).
I am not advocating throwing OAuth 2.0 out, starting over, or requiring signatures. All I have ever advocated for is the inclusion of a basic signature option in the core specification, in the spirit of OAuth 1.0. The 1.0 signature isn’t perfect, but as the Twitter developer community demonstrated, it is clearly within reach.