
nosurf's People

Contributors

aeneasr, alexedwards, benmanns, bfitzsimmons, dchest, dominikh, elithrar, justinas, lon-io, machiel, matiasinsaurralde, n10v, orian, paulbellamy, wader, zepatrik


nosurf's Issues

Token Length

The current implementation in tokengen.go returns a 44-character string with a single padding character (=) at the end.

It may make more sense to set rawTokenLength to 24, which will result in a 32 character output string when Base64 encoded, or alternatively 33, which will give you a 44 character string with no padding.

This is very clearly nitpicking (!) but may be worth looking at.
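
For reference, the output lengths can be checked with encoding/base64 (a quick sketch, not nosurf code):

fmt.Println(base64.StdEncoding.EncodedLen(24)) // 32 characters, no padding
fmt.Println(base64.StdEncoding.EncodedLen(32)) // 44 characters, one '=' of padding
fmt.Println(base64.StdEncoding.EncodedLen(33)) // 44 characters, no padding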

Prevent form resubmit

Hi,

Thank you for this library, this works perfectly against CSRF attacks.
However, is there a possibility to use this to combat "Ctrl+R" (browser resubmits)?

Ineffective encryption

I don’t understand the purpose of the encryption used in this library. Here is how it is working:

Encryption:

  • Generate a 32-byte token: A
  • Generate a 32-byte key: B
  • Xor the token with the key to "encrypt" it: A ^ B -> C
  • Concatenate the key with the encrypted token: BC
  • Send all 64 bytes to the client as a cookie: BC

Decryption:

  • Accept the 64-byte cookie: BC
  • Split it down the middle to get a 32-byte key: B, and a 32-byte encrypted blob: C
  • Xor the blob with the key to get the decrypted token: C ^ B -> A

This "crypto" is at best a thin layer of obfuscation. It would be logically equivilent to just send the original 32-byte token as a cookie. In fact, this would then be the Double Submit Cookie pattern.

If the intent is to use the Encrypted Token Pattern (described on the same page, and which I think is way more robust than Double Submit Cookies) then the algorithm would have to work like this:

  • Generate a 32-byte token: A
  • Combine it with some other information, such as RemoteAddr (B) and a timestamp (C): ABC
  • Encrypt it with a secret key: f(ABC, S) -> E
  • Give the encrypted blob to the client in a cookie: E

Then when a new request comes in, we:

  • Decrypt the cookie: g(E, S) -> ABC
  • Ensure the decrypted token A agrees with what was supplied in the form or header
  • Ensure the decrypted RemoteAddr B agrees with that of the request
  • Ensure the timestamp C is within a reasonable expiry duration
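
For illustration, below is a minimal sketch of that Encrypted Token Pattern, assuming AES-GCM with a server-side secret; the helper names, field layout and delimiter are made up for the example and are not nosurf's code:

package csrfenc

import (
	"crypto/aes"
	"crypto/cipher"
	"crypto/rand"
	"encoding/base64"
	"errors"
	"fmt"
	"io"
	"strconv"
	"strings"
	"time"
)

// seal encrypts "token|remoteAddr|timestamp" under the server secret
// and returns a base64 cookie value (nonce prepended to the ciphertext).
func seal(secret, token []byte, remoteAddr string) (string, error) {
	block, err := aes.NewCipher(secret) // secret must be 16, 24 or 32 bytes
	if err != nil {
		return "", err
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return "", err
	}
	nonce := make([]byte, gcm.NonceSize())
	if _, err := io.ReadFull(rand.Reader, nonce); err != nil {
		return "", err
	}
	plaintext := fmt.Sprintf("%s|%s|%d",
		base64.StdEncoding.EncodeToString(token), remoteAddr, time.Now().Unix())
	return base64.StdEncoding.EncodeToString(gcm.Seal(nonce, nonce, []byte(plaintext), nil)), nil
}

// open decrypts the cookie and performs the three checks listed above.
// formToken is the base64 token submitted in the form or header.
func open(secret []byte, cookie, formToken, remoteAddr string, maxAge time.Duration) error {
	raw, err := base64.StdEncoding.DecodeString(cookie)
	if err != nil {
		return err
	}
	block, err := aes.NewCipher(secret)
	if err != nil {
		return err
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return err
	}
	if len(raw) < gcm.NonceSize() {
		return errors.New("cookie too short")
	}
	plaintext, err := gcm.Open(nil, raw[:gcm.NonceSize()], raw[gcm.NonceSize():], nil)
	if err != nil {
		return err // tampered with or encrypted under a different key
	}
	parts := strings.SplitN(string(plaintext), "|", 3)
	if len(parts) != 3 {
		return errors.New("malformed cookie")
	}
	ts, err := strconv.ParseInt(parts[2], 10, 64)
	switch {
	case err != nil:
		return err
	case parts[0] != formToken:
		return errors.New("token mismatch")
	case parts[1] != remoteAddr:
		return errors.New("remote address mismatch")
	case time.Since(time.Unix(ts, 0)) > maxAge:
		return errors.New("token expired")
	}
	return nil
}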

Is this normal behavior?

Hi,

Thanks for this great package.

To test this I did the following (I'm using JSON for requests and responses):

I have 2 handlers, both on root "/", one is a GET and the other a POST.
On the GET I only return a token.
On the POST I verify the token and send a new one, like:

...
	// Get token from JSON into jd.

	tkn := nosurf.Token(r)
	if !nosurf.VerifyToken(tkn, jd.Token) {
		w.WriteHeader(http.StatusBadRequest)
		w.Write([]byte(`{"message":"Different tokens"}`))
		return
	}

	// Send the new token.
	w.Write([]byte(fmt.Sprintf(`{"token":"%s"}`, tkn)))

What I would like to know is whether it is normal behavior that, after getting
the token from the GET, I can make all my POST requests with that first token
and it always validates. I can even make e.g. 10 POST requests with the first
token, then another POST with the newest token returned by a POST, then start
using the first token again, and it still validates.

Thanks for your help.

ExemptRegexps doesn't work

I'm using something like this:
handler := nosurf.New(cleanHandler)
handler.ExemptRegexps("/css(.)", "/js(.)", "/images(.*)")

This is meant to exempt my assets, but it doesn't seem to work:

Request URL:http://192.168.237.131/js/bootstrap.min.js
Request Method:GET
Status Code:200 OK
Request Headers
Accept: */*
Accept-Encoding:gzip,deflate,sdch
Accept-Language:es,en;q=0.8,en-CA;q=0.6
Connection:keep-alive
Cookie:csrf_token=WK6UlEqLP3ioDLsUhuQTc1ZZ08DujAS5Gbxv0G2Riow=; _ga=GA1.1.1561828371.1415760157; session=MTQxOTg4Mzk2N3xfVmc3amc5OFh4RW04VUVjekhxLS16SEIwcEpyY0RUZW9EU3lodHdPSk4zUzdnTUpfYlFpR3l0dmM0a182Y0NTNVRMWE5TQ25fNWZhdzAwOHR5MjROYm5vNGoxdDRPNlA1V0FFdU5sZmQ5cm1HWVZidHk4bUg3aDBzVDBwQUhXSFNQb1JlRjdGTndCbms2UTJCN0liM0ZMR0dyRjMyYUlKSWxUVjU3NlhZVWUzaDNsMlZGczJrcnlsd0V5ZVM5SG9pc3RRVjdINk9RRy1PY245aGlkZTdRSnJncWJZelBLT196cHIwSUM0OVVUQThsNXB6NHVOS2g0PXzY2_Q54s1zOKcqBe5NimAmarqUBGrgq6LsWp1kQ28QZg==; flash=MTQxOTg4Mzk3NXxEdi1EQkFFQ180UUFBUkFCRUFBQUJQLUVBQUE9fKrNyW2LmqkQYwTkI9cMXz3dRF2VVQQx2C0LNCx5_UNC
Host:192.168.237.131
Referer:http://192.168.237.131/login
User-Agent:Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/31.0.1650.63 Safari/537.36
Response Headers
Accept-Ranges:bytes
Cache-Control:public, max-age=300
Content-Encoding:gzip
Content-Type:application/x-javascript
Date:Mon, 29 Dec 2014 20:30:43 GMT
Last-Modified:Thu, 29 Aug 2013 13:52:00 GMT
Transfer-Encoding:chunked
Vary:Cookie
Vary:Accept-Encoding

Allow context to use something other than an in-memory map

I may be missing something, but it appears (in context.go) that the csrf context is designed to use an in-memory map, with no other options to use something like memcache or redis.

I think this would prevent nosurf from being used in an environment where multiple apps run behind a load balancer, unless something like sticky sessions is employed.

It would be nice if the in-memory map was abstracted out to an interface, so that anyone could plug in alternative stores. Gorilla sessions does this, and it seems to work pretty well.
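
Something like the following hypothetical interface (the name and methods are made up, not part of nosurf) would be enough to let callers plug in memcache, redis, or the current in-memory map:

// TokenStore abstracts the per-request token storage so alternative
// backends can be swapped in behind the same interface.
type TokenStore interface {
	Get(key string) (token string, ok bool)
	Set(key, token string) error
	Delete(key string) error
}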

Token value error

------ context_legacy.go --------- (good)
func Token(req *http.Request) string {
	cmMutex.RLock()
	defer cmMutex.RUnlock()

	ctx, ok := contextMap[req]

	if !ok {
		return ""
	}

	return ctx.token
}

---------- context.go --------------------- (error)
// Token takes an HTTP request and returns
// the CSRF token for that request
// or an empty string if the token does not exist. <----------- ERROR
//
func Token(req *http.Request) string {
	ctx := req.Context().Value(nosurfKey).(*csrfContext)

	return ctx.token
}

The comment says "...an empty string if the token does not exist", but instead a panic occurs:
PANIC: interface conversion: interface {} is nil, not *nosurf.csrfContext
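
A minimal sketch of one possible fix, using the two-value type assertion so a missing context value yields an empty string instead of a panic (nosurfKey and csrfContext are the unexported names from the snippet above):

func Token(req *http.Request) string {
	ctx, ok := req.Context().Value(nosurfKey).(*csrfContext)
	if !ok {
		return ""
	}
	return ctx.token
}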

Sorry for my poor English.

Broken response with nosurf and gzip middleware

Chrome, Firefox and Safari do not seem to like the response for a failed token verification when a gzip middleware is used. Chrome reports "This webpage is not available ERR_INVALID_RESPONSE".

What seems to cause the problem is Content-Type: application/x-gzip. In the example this happens because nosurf's failure handler does not set any content type and the gzip middleware sets it to application/x-gzip if none is set.

I have read parts of the HTTP specs but can't really tell whether this is a valid response or not; most browsers do not like it, though. Would it make sense to change nosurf.defaultFailureHandler to http.Error(rw, "", http.StatusBadRequest) instead, which would set the content type to text/plain; charset=utf-8?

package main

import (
    "net/http"

    "github.com/codegangsta/negroni"
    "github.com/phyber/negroni-gzip/gzip"
)

func main() {
    n := negroni.New()
    n.Use(gzip.Gzip(gzip.DefaultCompression))
    n.UseHandlerFunc(func(rw http.ResponseWriter, r *http.Request) {
        // simulate nosurf.defaultFailureHandler
        rw.WriteHeader(http.StatusBadRequest)
    })
    http.ListenAndServe(":3001", n)
}

Or is this a bug in the gzip middleware? Should it not have a fallback content type (Content-Encoding: gzip is enough)?

Employ techniques to mitigate BREACH.

The CSRF token as it is now might be acquired by an attacker using the BREACH technique (assuming the server has compression turned on).

breach-mitigation-rails and django-debreach both take up an interesting approach with this, encrypting the CSRF token with a new random string on each request. It seems like this could be easily applied to nosurf.

Blacklist handlers rather than wrapping all and whitelist some

I have some handlers that use POST, PUT, etc. that I do NOT want protected by nosurf. These are REST API endpoints that don't need CSRF protection, as you can't use them without an Auth-Token header anyway.

I can use nosurf like this:

    n := negroni.Classic()

    handler := nosurf.New(mux)
    handler.ExemptPath("/v1")
    handler.ExemptPath("/v1/bulk")
    handler.ExemptPath("/v1/flush")
    n.UseHandler(handler)
    n.Run(fmt.Sprintf(":%d", port))

But now my poor REST API endpoints get a Vary: Cookie and a Set-Cookie too.

Can I use nosurf to protect just individual handler functions and not the whole mux?
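
A minimal sketch of wrapping only selected handlers instead of the whole mux (formHandler and apiHandler are hypothetical):

mux := http.NewServeMux()
// Only the HTML form handler goes through nosurf...
mux.Handle("/form", nosurf.New(formHandler))
// ...while the API endpoints are registered untouched, so they get no
// Vary: Cookie or Set-Cookie headers from nosurf.
mux.Handle("/v1/", apiHandler)

n := negroni.Classic()
n.UseHandler(mux)
n.Run(fmt.Sprintf(":%d", port))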

OTP not implemented correctly.

The key needs to be completely secret. In nosurf's implementation, the key is pasted in plain text onto the front of the data.

handler.go: Token appears to be generated twice if not found in cookie

If there is no cookie it appears like the token is generated twice.

First, in handler.go on line 85, after ErrNoCookie, "h.RegenerateToken(w, r)" generates a token. However, realToken remains nil.

Then on line 98 it checks "len(realToken) != tokenLength", but since realToken was never set, that comparison is 0 != 32 and the token is regenerated again on line 99.

Filtering out safe methods and excluded paths

Nosurf skips the CSRF check for safe methods (like GET) and for exempted paths (via ExemptPaths) when there is no need to check the token. That's good, but...

In handler.go, in the ServeHTTP function, on every request, even when you are not interested in checking the token, you still do:

  • addNosurfContext(r)
  • w.Header().Add("Vary", "Cookie")
  • tokenCookie, err := r.Cookie(CookieName)
  • realToken = b64decode(tokenCookie.Value)
  • if len(realToken) != tokenLength { ...

...and all of this is wasted work, because right after it you do:

if sContains(safeMethods, r.Method) || h.IsExempt(r) {
    // short-circuit with a success for safe methods
    h.handleSuccess(w, r)
    return
}

I suggest moving this check as close to the top of ServeHTTP as possible, so nosurf can avoid doing that unnecessary work. Performance would improve.
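
A minimal sketch of the reordering proposed here, reusing the helper names from the snippets above (this is not nosurf's actual code):

func (h *CSRFHandler) ServeHTTP(w http.ResponseWriter, r *http.Request) {
	// Short-circuit safe and exempt requests before any cookie parsing,
	// base64 decoding or context work is done.
	if sContains(safeMethods, r.Method) || h.IsExempt(r) {
		h.handleSuccess(w, r)
		return
	}

	addNosurfContext(r)
	w.Header().Add("Vary", "Cookie")
	// ...cookie lookup, token decoding and verification happen only here...
}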

Doubts about many cookies and many Path for a single domain.

Issue opened for the creation of a wiki page that summarizes the doubts and problems for newbies (#52).

Writing #52 I had a doubt:

Do I need to use surfing.SetBaseCookie(http.Cookie{Path: "/"})?

I don't know if many CSRF cookies make sense if I have these routes:

  • "/"
  • "/login" - server rendered html form with csrf
  • "/logout" - just get which logouts
  • "/api" - json api endpoint POST, GET, so it needs CSRF protection
  • "/oldApiEndpoint"

Why does nosurf keep creating cookies for these routes? I think I just need one cookie with Path: "/" and Domain: ".mysite.com".

Where am I wrong?

Inappropriate key in call to `context.WithValue`

handler_go17_test.go

should not use built-in type string as key for value; define your own type to avoid collisions


Description

To prevent clashes across packages using context, the provided key must be comparable and should not be of type string or any other built-in type. Users of WithValue should define their own types for keys.

To avoid allocating when assigning to an interface{}, context keys often have concrete type struct{}. Alternatively, exported context key variables’ static type should be a pointer or interface.

please consider updating tags or deleting the current tag

My co-worker and I were attempting to use nosurf for something. We encountered some issues and once we started looking at them we realized glide picked up the 0.1 tag from 4 years ago rather than tip.

Not a big deal, since we should have looked more closely when we added the dependency via glide, but please consider tagging stable releases more frequently or removing the existing tag.

Combining Session and CSRF cookie

Background: The current system of open reads of the CSRF cookie by Javascript also means that valid CSRF tokens can be stolen and re-used infinitely(?) assuming 1) the route does not require auth/session data or 2) a valid session was also stolen.

Thought experiment: I'm contemplating changes to combine the two separate cookies (session and CSRF) into a single cookie. This would reduce the load on the randomness source and reduce bandwidth between the server and client.

Currently, however, the CSRF cookie is Javascript-readable, which is an attack vector for the session. To prevent malicious client-side Javascript from stealing the OTP CSRF token (and thereby knowing how to generate the true token to compare with it), two changes seem required:

  1. Main CSRF cookie token should be httpOnly, Strict, and Secure (HTTPS-only)
    by default: https://github.com/justinas/nosurf/blob/master/handler.go#L204
  2. The form/header token OTP+OTP(token) should be hashed with a server secret: OTP+OTP(hash(secret, token))

This means that if Javascript were to retrieve/steal the form/header token they would
be unable to construct the original HTTP-only cookie token for the session itself. (This is important if the CSRF cookie was also used as the session cookie)

Remove Referer check

An HTTP Referer [sic] check was added in commit b1b164f for HTTPS sites. Its motivation is:

+       // if the request is secure, we enforce origin check
+       // for referer to prevent MITM of http->https requests

I doubt that this check will add much value. The token is not secret, it just needs to be unpredictable. Since httpOnly is not set by default, the cookie can also be leaked through XSS.
An open redirect vulnerability would also bypass this check.

There is at least one case where this breaks: when (Firefox) users disable the Referer header. What about using the Secure cookie flag instead? That way, a different token is sent to the http and https origins, and even a passive adversary cannot see the token of the secure origin.

Related comment: #11 (comment)

Edit:
even with the Secure flag set, it would not help with existing cookies (which have a default lifetime of one year). Any cookies injected into a HTTP origin will also be sent to a HTTPS one.

A possible alternative for the referrer check here is the use of the SameSite attribute, although it will have the same problem as above (HTTP cookies can still be injected into HTTPS).

Assuming a passive adversary, a possible mitigation is the inclusion of the scheme in the cookie name or value. When the scheme does not match, treat it as invalid and send a new cookie (with the Secure flag set).
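
For reference, a minimal sketch of opting into the Secure flag (and SameSite) with the existing SetBaseCookie API; the surrounding setup is illustrative:

handler := nosurf.New(mux)
handler.SetBaseCookie(http.Cookie{
	Path:     "/",
	Secure:   true,                    // only sent over HTTPS, so an http:// origin sees a different token
	SameSite: http.SameSiteStrictMode, // the SameSite alternative mentioned above
})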

RegenerateToken generates two CSRF cookies when no previous CSRF cookie was set

When RegenerateToken() is called in a request where the client did not send a CSRF cookie, two CSRF cookies are generated:

map[Set-Cookie:[csrf_token=aZA5CKCpmzGwlyfyFZp1akOOo4dSbZEdSAziaN+nRYE=; Path=/; Domain=example.com; Max-Age=31536000; HttpOnly; Secure csrf_token=xe/JUh5YavyzQtmIqU018swoHmPN5nQsTSqSJscKJU4=; Path=/; Domain=example.com; Max-Age=31536000; HttpOnly; Secure] Vary:[Cookie]]

Depending on the order in which the browser stores the cookies, this can lead to false-positive CSRF detection.

Ability to handle multiple cookies in context

Is there an easy way to handle the scenario of multiple cookies in context and not just the default csrf_token ?

It's for an iframe integration scenario where the same frame is included multiple times in the page. The way the library works now, each frame overwrites the csrf_token, meaning that if the 1st form submits, it will have a different token than the latest one in the context.

For setting the tokens under different names, I've managed to simply append the frameId inside the HandlerFunc, but the problem is in the verification step, where the context is the same.

example is insecure

This may be a bit silly since I don't think it is intended usage, but I find it kind of odd that by default nosurf doesn't actually verify that the CSRF token was generated by the server, just that it matches the cookie, so it is relatively easy to bypass.

Anyway, the example code says...

<!-- Try removing this or changing its value
     and see what happens -->

OK challenge accepted:

==> GET
ignoring csrf token prnkNlDJPKrNQztxLZ41TnTV9ILVRhpbfTatqo/qbF0= -- posting with cookie MDEyMzQ1Njc4OTAxMjM0NTY3ODkwMTIzNDU2Nzg5MDE= X-CSRF-TOKEN header guBSEnDYkGLIhRVVJKnMdUfrx57ipBW2YhecPtC1ni+y0WAhRO2mVfC8JWQWmvhAcdz/p9KVJ4VWIqoJ6IyuHg== instead
==> POST
response: http status code 200

So I am able to bypass the CSRF protection just by generating my own token and removing the one from the form. Again, I realize this is possibly a silly scenario, but I was challenged to break the example code and I did. Client code follows:

package main

import (
	"bytes"
	"crypto/rand"
	"encoding/base64"
	"fmt"
	"io"
	"net/http"
	"net/url"
	"strings"
	"time"
)

const (
	tokenLength = 32
)

func main() {
	client := &http.Client{}

	selfTokenBytes := []byte("01234567890123456789012345678901")
	selfCookieToken := b64encode(selfTokenBytes)
	selfHeaderToken := b64encode(maskToken(selfTokenBytes))
	r := bytes.NewReader([]byte(""))

	fmt.Printf("==> GET\n")
	req, err := http.NewRequest("GET", "http://127.0.0.1:8000", r)
	resp, err := client.Do(req)
	if err != nil {
		fmt.Printf("err: %v\n", err)
		return
	}
	s := strings.Split(resp.Header["Set-Cookie"][0], ";")
	ss := strings.SplitN(s[0], "=", 2)
	fmt.Printf("ignoring csrf token %v -- posting with cookie %v X-CSRF-TOKEN "+
		"header %v instead\n", ss[1], selfCookieToken, selfHeaderToken)

	fmt.Printf("==> POST\n")

	form := url.Values{}
	form.Add("name", "jolan")

	req2, err := http.NewRequest("POST", "http://127.0.0.1:8000", strings.NewReader(form.Encode()))
	req2.Header.Add("X-CSRF-Token", selfHeaderToken)
	expiration := time.Now().Add(365 * 24 * time.Hour)
	cookie := http.Cookie{Name: "csrf_token", Value: selfCookieToken, Expires: expiration}
	req2.AddCookie(&cookie)

	resp2, err := client.Do(req2)
	if err != nil {
		fmt.Printf("err: %v\n", err)
		return
	}

	fmt.Printf("response: http status code %v\n", resp2.StatusCode)
}

func b64encode(data []byte) string {
	return base64.StdEncoding.EncodeToString(data)
}

// Masks/unmasks the given data *in place*
// with the given key
// Slices must be of the same length, or oneTimePad will panic
func oneTimePad(data, key []byte) {
	n := len(data)
	if n != len(key) {
		panic("Lengths of slices are not equal")
	}

	for i := 0; i < n; i++ {
		data[i] ^= key[i]
	}
}

func maskToken(data []byte) []byte {
	if len(data) != tokenLength {
		fmt.Printf("%v != %v\n", len(data), tokenLength)
		panic("data != tokenLength")
	}

	// tokenLength*2 == len(enckey + token)
	result := make([]byte, 2*tokenLength)
	// the first half of the result is the OTP
	// the second half is the masked token itself
	key := result[:tokenLength]
	token := result[tokenLength:]
	copy(token, data)

	// generate the random one-time pad (key)
	if _, err := io.ReadFull(rand.Reader, key); err != nil {
		panic(err)
	}

	oneTimePad(token, key)
	return result
}

CSRF failed with bad request

I am following Trevor Sawler's Golang course, and hit a roadblock with section 8.2. I have detailed the issue here. Could someone please help me out? I have followed the instructions carefully, but it doesn't work as seen in the video. Many thanks!

nosurf breaks MultipartReader()

First, thank you for fixing enctype="multipart/form-data".

However, now processing a form with reader, err := r.MultipartReader() results in the following error: http: multipart handled by ParseMultipartForm.

Using MultipartReader() is advantageous for my use case because it allows me to process the request body (most importantly the file uploads) as a stream.
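
For context, a minimal sketch of the streaming usage that breaks (standard library only, assuming the usual io and net/http imports; the handler is illustrative):

func uploadHandler(w http.ResponseWriter, r *http.Request) {
	// MultipartReader streams parts one by one instead of buffering
	// the whole body the way ParseMultipartForm does.
	reader, err := r.MultipartReader()
	if err != nil {
		http.Error(w, err.Error(), http.StatusBadRequest)
		return
	}
	for {
		part, err := reader.NextPart()
		if err == io.EOF {
			break
		}
		if err != nil {
			http.Error(w, err.Error(), http.StatusBadRequest)
			return
		}
		// Process the part (e.g. a file upload) as a stream.
		io.Copy(io.Discard, part)
		part.Close()
	}
}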

Use golang.org/x/net/context instead of gorilla based context

The gorilla-based context used in nosurf needs mutex locking and garbage collection. golang.org/x/net/context is a better context implementation, created by the Go team, that requires no garbage collection. It would be good to see nosurf migrate to x/net/context.

Token can be stored in the context like this:

type csrfTokenKey struct{}
...
// in the handler
c = context.WithValue(c, csrfTokenKey{}, token)

And retrieved by:

func ContextCSRFToken(c context.Context) (string, bool) {
    v, ok := c.Value(csrfTokenKey{}).(string)
    return v, ok
}

"csrf_token" cookie being generated on exempted routes

I am trying to exempt a few routes from CSRF, but noticed that a "csrf_token" cookie still gets generated on those routes. It doesn't seem necessary to have that cookie on exempted routes. Also, is that cookie still necessary after a form has been successfully submitted?

An example with only 1 route that is supposed to be exempted from csrf tokens:

package main

import (
    "github.com/gorilla/mux"
    "github.com/justinas/nosurf"
    "log"
    "net/http"
)

type Routes []Route
type Route struct {
    Method      string
    Pattern     string
    HandlerFunc http.HandlerFunc
}

func mainHandler(w http.ResponseWriter, r *http.Request) {

}

func main() {
    var routes = Routes{
        Route{"GET", "/mypath", mainHandler},
    }
    router := mux.NewRouter().StrictSlash(true)
    for _, route := range routes {
        handler := route.HandlerFunc
        router.Methods(route.Method).Path(route.Pattern).Handler(handler)
    }

    // csrf protection
    csrfHandler := nosurf.New(router)
    csrfHandler.ExemptPath("/mypath")

    port := ":8080"
    log.Println("Listening at", port)
    log.Fatal(http.ListenAndServe(port, csrfHandler))
}

Remove examples folder

Using gvt (and presumably other tools that support automatic recursive dependency vendoring) leads to a bunch of dependencies being unnecessarily added when vendoring nosurf.

$ gvt fetch github.com/justinas/nosurf
2016/05/23 19:44:37 fetching recursive dependency github.com/hoisie/web
2016/05/23 19:44:41 fetching recursive dependency github.com/zenazn/goji
2016/05/23 19:44:46 fetching recursive dependency golang.org/x/net/websocket

It looks like this is because of the examples folder being in the main repository.

Renaming the examples directory to _examples would fix this for users of gvt. I'm unsure about other vendoring tools.

My gut feeling is it would be cleaner and safer to move these examples out of the main repository (where they're not actually needed) though. Perhaps to gists and referencing them from the readme instead? I'm happy to do that and send a PR.

Logging

How would I log it when somebody sends a request with a bad token?
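
One way is a custom failure handler (a sketch using nosurf's SetFailureHandler and Reason; the logging itself is up to you):

handler := nosurf.New(mux)
handler.SetFailureHandler(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
	log.Printf("CSRF failure from %s on %s: %v", r.RemoteAddr, r.URL.Path, nosurf.Reason(r))
	http.Error(w, http.StatusText(http.StatusBadRequest), http.StatusBadRequest)
}))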

How does nosurf OTP protect against BREACH?

The BREACH paper states

In order for the attack to be successful, several things are required. To be vulnerable to this side channel, a web app must:

  • Be served from a server that uses HTTP-level compression
  • Reflect user-input in HTTP response bodies
  • Reflect a secret (such as a CSRF token) in HTTP response bodies

Additionally, while not strictly a requirement, the attack is helped greatly by responses that remain mostly the same modulo the attacker’s guess.

Just so I have this correct: the masking of the token in crypto.go is there to make sure the HTTP cookie header and the HTML body do not contain the same string/bytes, correct?

This is to avoid BREACH / CRIME styled deconstruction of the compression to find repeated strings? Am I understanding this correctly?

If the bytes are random to begin with, why is the token XORed with the one-time pad (OTP), since it will be unique with every response anyway?

SetBaseCookie not having effect

Hi,

My code:

func (mr *Router) Handle(method string, path string, handler http.Handler) {
	mr.router.Handler(method, path,
		csrfHandler(
			logHandler(logger.Debug)(
				securityHeaderHandler()(
						handler,
				),
			),
		),
	)
}

func csrfHandler(next http.Handler) http.Handler {
	csrfHandler := nosurf.New(next)
	csrfHandler.SetBaseCookie(http.Cookie{
		Name: csrfCookieName, // "csrf"
		Path: "/",
		Domain: "",
		Secure: true,
		HttpOnly: true,
		MaxAge: int(sessionTimeoutSec.Seconds()),
		SameSite: http.SameSiteStrictMode,
	})
	return csrfHandler
}

However, every request without a cookie returns:

$ curl -v http://localhost:8080/login 1>/dev/null
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0*   Trying ::1...
* TCP_NODELAY set
* Connected to localhost (::1) port 8080 (#0)
> GET /login HTTP/1.1
> Host: localhost:8080
> User-Agent: curl/7.54.0
> Accept: */*
>
< HTTP/1.1 200 OK
< access-control-allow-methods: GET,POST,HEAD,OPTIONS
< x-content-type-options: nosniff
< content-security-policy: default-src 'none'; script-src 'self'; style-src 'self'; img-src 'self' data:; font-src 'self' fonts.gstatic.com
< set-cookie: csrf_token=5BeFIyTJI/fipPPgEcdPcw4t4vTfOZDjYoOCt8/iRSI=; Path=/; Max-Age=900; HttpOnly; Secure; SameSite=Strict

Wiki page for newbies doubts and problems

I'm a newbie in everything.

I would like to write a Wiki page for newbies like me. Would you mind, @justinas?

Title: "How to use nosurf with external Single Page Application (SPA) like Ember, React, Angular or jQuery Ajax".

My app is both an SPA and server-rendered: authentication (using authboss - https://github.com/volatiletech/authboss), plus the Javascript part, so I need both the JSON API endpoint to be CSRF-protected and the CSRF form values for authboss and other server-rendered pages.

I'm using chi router (https://github.com/go-chi/chi) like this:

package main

import (
	"log"
	"net/http"

	"github.com/go-chi/chi"
	"github.com/justinas/nosurf"
)

func main() {
	r := chi.NewRouter()
	r.Use(Nosurfing, addCookie)
	r.Get("/", func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("welcome"))
	})
	http.ListenAndServe(":3000", r)
}

func Nosurfing(h http.Handler) http.Handler {
	surfing := nosurf.New(h)
	// Using Path: "/" just because I don't know if it's right to create a cookie for
	// every "sub-path" like "/auth" or "/api"; I opened an issue to clarify this:
	// https://github.com/justinas/nosurf/issues/53
	surfing.SetBaseCookie(http.Cookie{Path: "/"})
	surfing.SetFailureHandler(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		log.Println("Failed to validate CSRF token:", nosurf.Reason(r))
		w.WriteHeader(http.StatusBadRequest)
	}))
	return surfing
}

func addCookie(handler http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		// Path: "/" for the same reason as above (see issue #53).
		cookie := &http.Cookie{Name: "nosurf_cookie_for_ajax", Value: nosurf.Token(r), Path: "/"}
		http.SetCookie(w, cookie)
		handler.ServeHTTP(w, r)
	})
}

Now every time I visit a route it creates a cookie named csrf_token and one named nosurf_cookie_for_ajax.

The first one is masked and should be set HttpOnly; the second one is different every time and is meant to be read from Javascript, which has to send it with every POST (or otherwise CSRF-protected) call in a request header named X-CSRF-Token.

Am I right?

The second cookie could be created only when a user is logged in (if you need that).

Default Security Settings:

I think it's good to write here also the default security settings for cookies:

type Cookie struct {
	Name     string   // For cookie 1 I would use the default value. For 2 I can call it "X-CSRF-Token", matching the request header name.
	Value    string   // For cookie 1 the default, for 2 I can use `nosurf.Token(r)`
	Path     string   // 1: "/", 2: "/", but I need to understand this behaviour better; I opened an issue about it: https://github.com/justinas/nosurf/issues/53
	Domain   string   // default, like Path I think
	MaxAge   int      // here I have some doubts; I think I will leave nosurf's default for both
	Secure   bool     // true (be careful if you are at localhost)
	HttpOnly bool     // true for 1 and false for 2
	SameSite SameSite // up to you; study it, it helps with CSRF problems
}

Summary:

  • Review the code and tell newbies if there is something bad or wrong
  • Suggest default security settings for cookies
  • Read the second article and tell newbies if it is a bad idea to use a route like /csrf for tokens

Other articles:

Signing Cookies

nosurf does not currently sign cookies as the standard http.Cookie implementation only defines the "basic" attributes of a cookie.

CSRF cookies should be signed with HMAC-SHA256 (so we can identify attempts to tamper with them), and then authenticated before checking the cookie against the submitted request. An example of a solid authentication implementation can be found here.

  • Authenticated cookies should (really) really be the default, but I'm not sure how to reconcile this without breaking the existing API. I would argue that the benefit from authenticating cookie values outweighs the downside of breaking the API. Any major change would be a compile-time error too, and is therefore easier to resolve (no weird gremlins at run-time).
  • I would also consider updating the README to stress (as it's extremely important) that package users serve their site over HTTPS (SSL/TLS), as otherwise CSRF tokens are effectively lip service given that the cookies themselves can be hijacked (and as their contents aren't authenticated, changed at whim).
  • Encrypted cookies would be an additional "nice to have" but do not really circumvent the need for SSL/TLS. The existing "encryption" references in the docs should also be changed to "masked" or "masking".
  • You would facilitate authentication/encryption by allowing a user to pass in keys in a func (h *CSRFHandler) SetAuthKeys(key []byte, keys ...[]byte) and func (h *CSRFHandler) SetEncryptionKeys(key []byte, keys ...[]byte), with the variadic param allowing a package user to pass in multiple key pairs (which facilitates cycling keys). This would be similar to how gorilla/securecookie handles key rotation, but you could probably get away with just accepting a single key.
  • Leveraging gorilla/securecookie itself may not be a bad idea. You could wrap its exported functions with your own to maintain as much of your existing API as possible, or have nosurf.New accept an options struct that then calls securecookie API before then returning a configured *CSRFHandler.
  • I'd also suggest bringing the default expiry way down to something like a week, tops. Even a day would be fine—users don't take a day to fill out a form.
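
A minimal sketch of the HMAC-SHA256 signing and verification suggested above (a stand-alone example, not nosurf's API; assumes crypto/hmac, crypto/sha256, encoding/base64 and strings imports, and simplified single-key handling):

// sign returns value + "|" + base64(HMAC-SHA256(key, value)).
func sign(key []byte, value string) string {
	mac := hmac.New(sha256.New, key)
	mac.Write([]byte(value))
	return value + "|" + base64.StdEncoding.EncodeToString(mac.Sum(nil))
}

// verify checks the signature and returns the original value if it is authentic.
func verify(key []byte, signed string) (string, bool) {
	i := strings.LastIndex(signed, "|")
	if i < 0 {
		return "", false
	}
	value, sig := signed[:i], signed[i+1:]
	want, err := base64.StdEncoding.DecodeString(sig)
	if err != nil {
		return "", false
	}
	mac := hmac.New(sha256.New, key)
	mac.Write([]byte(value))
	if !hmac.Equal(mac.Sum(nil), want) {
		return "", false
	}
	return value, true
}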

Vary: Cookie Header

The middleware should set a Vary: Cookie header on CSRF-protected pages to force proxies never to cache the page. This should, in most cases, override any Cache-Control or ETag headers set further down the line.

The Django docs cover this nicely, as do the Varnish docs.

Note that in many cases good proxies won't cache the page anyway, but it's worth being sure.

PS: I can send a PR for this, but since it's effectively a one-liner prior to passing the handler, I figured it's probably easier for you to include in your next update/change.

Use only crypto/rand for token generation.

math/rand is cryptographically insecure and thus isn't suitable for token generation. We should switch to using only crypto/rand.

This should be an easy, non-breaking change.
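
For reference, a minimal sketch of token generation backed only by crypto/rand:

// generateToken fills a byte slice from crypto/rand.Reader, a
// cryptographically secure source, with no math/rand fallback.
func generateToken(length int) ([]byte, error) {
	token := make([]byte, length)
	if _, err := io.ReadFull(rand.Reader, token); err != nil {
		return nil, err
	}
	return token, nil
}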

Send a response body in defaultFailureHandler

Would it be possible to change the defaultFailureHandler so it includes a basic response body? Something like:

func defaultFailureHandler(w http.ResponseWriter, r *http.Request) {
	http.Error(w, http.StatusText(FailureCode), FailureCode)
}

Instead of the current http.Error(w, "", FailureCode).

Using http.StatusText() wouldn't leak any extra useful information to an attacker, but it would be a nicer and less confusing user (and developer) experience than just seeing a blank page on failure.

I'm very happy to send a PR for this, if you like.

Validation fails with X-CSRF-Token

I'm only able to get a 400 Bad Request with POST/DELETE requests to my REST application.

Running an app in localhost I have this value in my csrf_token cookie:
0bYcWmFvvMpZXMSgau2Jx3uxQGhyfEtTxOEC6zrtlfs=

And the value passed back in X-CSRF-Token:
0bYcWmFvvMpZXMSgau2Jx3uxQGhyfEtTxOEC6zrtlfs=

My set up:

http.Handle("/", nosurf.New(myRoutes))
http.ListenAndServe(":"+port, nil)

I had this working fine until a recent go get update, so some kind of regression maybe?

Possible flaw

I am relatively new to back-end programming and Golang, but I thought it'd be a good idea to post this. I am able to "authenticate" requests against the example code by passing the csrf_token cookie and form data to cURL like this:

curl http://localhost:8000 -d "name=abcd&csrf_token=dEdKyAmXFbvNZGcWvVcBQVAb8IlVwS10SAFqwSQ/k7IkMvQbmRzMHV4M5V197UPycAEOncxxler1It9TtHbpiA==" --cookie "csrf_token=UHW+05CL2aaTaIJLwLpCsyAa/hSZsLievSO1kpBJejo="

and then the response is

<!doctype html>
<html>
<body>

<p>Your name: abcd</p>

<form action="/" method="POST">
<input type="text" name="name">


<input type="hidden" name="csrf_token" value="U4/bPZUAKZ&#43;wezr8YcWkdEpsJ&#43;2gnLt6UPPXzIhXAKYD&#43;mXuBYvwOSMTuLehf&#43;bHanbZ&#43;TksA&#43;Tt0GJeGB56nA==">
<input type="submit" value="Send">
</form>
</body>
</html>

I'm not completely sure whether this is expected behavior or a flaw.
