Secure by default #2362
Should this be a required param - secure by default - with an opt in? An `API_SECRET` for the Docker instance; the prod instance should prevent any requests from going through if you don't set up a secret.
I was chatting with @kevin-dp and @samwillis to clarify how to implement this. Context: Electric should always be running behind a proxy because it exposes the database. The idea is that by default Electric requires passing an API_SECRET, such that only requests that present that secret will be accepted by the server. A developer has to set up that key in any of the clients that are authorized to make requests, i.e. the proxy. If we follow this implementation route, the server and any authorized client need to be updated at the same time in the uncommon scenario of changing the secret. This sounds like an 80/20 solution, but we need to consider that it comes with downtime and operational complexity, since a user might have multiple running apps. The best approach to me seems to be an app_key generation endpoint, such that a proxy client with a non-expired, non-revoked key would not lose access to the server even if the secret changes.
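To make the proxy idea above concrete, here is a minimal sketch of a proxy handler that attaches the secret to upstream requests so browser clients never see it. The `ELECTRIC_URL` constant and the `electric-secret` header name are assumptions for illustration, not Electric's actual API.

```typescript
// Hypothetical sketch: a proxy builds the upstream request to Electric,
// attaching the API secret that end-user clients themselves never see.
const ELECTRIC_URL = "http://localhost:3000"; // assumed upstream address

function buildUpstreamRequest(
  path: string,
  apiSecret: string
): { url: string; headers: Record<string, string> } {
  if (!apiSecret) {
    // Secure by default: refuse to forward anything if no secret is configured.
    throw new Error("API_SECRET is not set; refusing to forward request");
  }
  return {
    url: `${ELECTRIC_URL}${path}`,
    // Assumed header name; the real header is whatever Electric specifies.
    headers: { "electric-secret": apiSecret },
  };
}
```

The client sends its request to the proxy, the proxy calls `buildUpstreamRequest` and forwards the result to Electric; rotating the secret then means updating the proxy's environment, which is exactly the coupling discussed above.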
With this secret, users will still need to run a proxy because it is unsafe to send requests with the secret from the client. So the client needs to send a request to the proxy, which then sends a request to Electric with the secret included. Since the user is already running a proxy, it seems that in most cases you would just restrict the Electric server to only accept connections from that proxy server (e.g. via some custom VPC or whatever). Now, I hear that some platforms do not allow securing it this way, but how common is that? Which cloud platforms don't support this? Is that a real concern for us right now? Do we have any users requesting this?
If I understand this correctly, Electric would have a way to generate API tokens. So Electric keeps a list of valid API tokens, and every request must present one of these tokens. The question is, if Electric exposes an …
I agree. People who run software in production know their network. People run private services all the time and have therefore come across this issue. Is it a real concern we need to address? The naive user can always set up the secret unsafely if they don't understand the architecture. If we want to make people aware of the sensitivity of running Electric on a public network, we should do that through documentation. The fact that you can read your entire database should be enough to raise your attention. Regarding the API endpoint: you'd still need to protect it by presenting some secret, but you decouple that from validating tokens emitted in the past, so that you can change the secret you need to present without revoking access to users that have requested API tokens with old secrets.
Indeed. From the discussion here, these tokens seem to be API tokens and not JWTs. So there's no need to sign these tokens with a secret and hence we don't have the problem of not being able to validate tokens that were generated with a previous secret. What I mean:
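A minimal sketch of the opaque-token idea, under the assumption that Electric would keep a server-side list of issued tokens: validation is a membership lookup, so no signing secret is involved and rotating `API_SECRET` cannot invalidate previously issued tokens. The in-memory `Set` store and function names are purely illustrative.

```typescript
import { randomBytes } from "node:crypto";

// Hypothetical token store; a real implementation would persist this.
const issuedTokens = new Set<string>();

function issueToken(): string {
  // Opaque random token: no payload, no signature, nothing to verify
  // against a secret later.
  const token = randomBytes(32).toString("hex");
  issuedTokens.add(token);
  return token;
}

function isValidToken(token: string): boolean {
  // Validation is a lookup, independent of any signing secret.
  return issuedTokens.has(token);
}

function revokeToken(token: string): void {
  issuedTokens.delete(token);
}
```

Contrast with JWTs: a JWT is verified against the signing secret, so rotating that secret invalidates every outstanding token unless the server also keeps the old secret around.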
This way, even if the …
I want to argue strongly against expanding the scope of this to a more advanced API that issues tokens. We do not need that. The internet is full of stories of people who accidentally expose their whole database to the internet; Mongo was notorious for it. My fear is that we are not the database, but a highly privileged client of it, with a very easy path for a user to leak their db. We do not want Electric to be known for the same issues. Setting up your network to only allow access from specific IP ranges, while not difficult, also isn't trivial. When you then consider that a lot of users will want to run the proxy layer in an edge function on Cloudflare/AWS/Deno, it becomes even more complex to configure that restriction (you have no idea what the IP range could be!). We have to use CF Tunnels in our cloud to secure our Electric servers. Adding this API token is super easy, and solves the issue for a large number of users. If we are concerned about users doing zero-downtime updates of their secret, they can do a phased deploy with multiple secrets in the client that are tried in order. But a user doing zero-downtime deploys is also likely to be a highly skilled user who can configure their networks and set a
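The phased-rotation idea mentioned above can be sketched as follows: during rollover the server accepts any secret in its current allow-list, and a client configured with both the old and new secret tries them in order. All names here are illustrative assumptions, not Electric's API; a production check would also use a timing-safe comparison rather than plain string equality.

```typescript
// Hypothetical sketch of zero-downtime secret rotation via multiple
// accepted secrets. Plain equality is used for brevity only.
function checkSecret(presented: string, accepted: string[]): boolean {
  return accepted.includes(presented);
}

// Client side: try configured secrets in order, use the first the
// server would accept.
function firstAcceptedSecret(
  candidates: string[],
  accepted: string[]
): string | undefined {
  return candidates.find((s) => checkSecret(s, accepted));
}
```

The rollover then has three phases: the server accepts `[old, new]`, clients are redeployed with the new secret, and finally the server drops `old`; at no point does a correctly configured client get rejected.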
I don't think it is; people will miss it, or just forget to do it. The point of "secure by default" is exactly that: we remove the possibility of users shooting themselves in the foot.
@samwillis ok. I'm trying to figure out the requirements for this task, hence my questions.
@kevin-dp Yes, exactly. I think that was the initial title of the issue, and so a little context was lost. |
This is a good argument; I rest my case. I still think that the approach is not the best. We're making it easier for the non-sophisticated developer, but that is exactly the kind that would still misuse the API_SECRET. We're pushing the no-downtime approach onto the developer, but then they will probably not be able to handle the issue anyway. Offering the endpoint would address that, but we can do that later. I agree we do need this for the edge-based proxy.
This is another layer of security for protecting access to the Electric API when it's deployed on the internet. Not all hosting platforms make it easy to lock down access so we should have our own way of securing access.