
Secure by default #2362

Open
KyleAMathews opened this issue Feb 21, 2025 · 10 comments · May be fixed by #2404
@KyleAMathews (Contributor)
This is another layer of security for protecting access to the Electric API when it's deployed on the internet. Not all hosting platforms make it easy to lock down access so we should have our own way of securing access.

@balegas balegas added this to the Next major release milestone Feb 21, 2025
@samwillis (Contributor)

Should this be a required param - secure by default - with an opt-in INSECURE=true that can be used in development, or when the user knows they are secure because of their network configuration?
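A minimal sketch of that opt-in behaviour, assuming hypothetical `ELECTRIC_SECRET` and `ELECTRIC_INSECURE` variable names (not Electric's actual config keys):

```typescript
// Hypothetical "secure by default" startup check: refuse to boot unless a
// secret is configured, or the operator explicitly opts out for development.
// Returns null when startup may proceed, or an error message otherwise.
function checkStartupConfig(env: Record<string, string | undefined>): string | null {
  const secret = env.ELECTRIC_SECRET
  if (secret && secret.length > 0) return null // secret configured: secure
  if (env.ELECTRIC_INSECURE === 'true') return null // explicit opt-out for dev
  return (
    'Refusing to start: set ELECTRIC_SECRET, or set ELECTRIC_INSECURE=true ' +
    'if your network configuration already restricts access.'
  )
}
```

The key property is that doing nothing fails closed: the insecure mode requires a deliberate, visible opt-in.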

@balegas changed the title from "Allow people to set a API_SECRET for the Docker instance" to "Secure by default" Feb 24, 2025
@balegas (Contributor) commented Feb 24, 2025

The prod instance should prevent any requests from going through if you don't set up a secret.

@balegas (Contributor) commented Mar 3, 2025

I was chatting with @kevin-dp and @samwillis to clarify how to implement this.

Context: Electric should always be running behind a proxy because it exposes the database. The idea is that, by default, Electric requires an API_SECRET, so that only requests presenting that secret will be accepted by the server. A developer has to set up that key in any client that is authorized to make requests --- i.e. the proxy.

If we follow this implementation route, the server and every authorized client need to be updated at the same time in the (uncommon) scenario of changing the secret. This sounds like an 80/20 solution, but we need to consider that it comes with downtime and operational complexity, since a user might have multiple running apps.

The best approach to me seems to be an app_key generation endpoint, such that a proxy client with a non-expired/non-revoked key would not lose access to the server even if the secret changes.

@kevin-dp (Contributor) commented Mar 3, 2025

> Context: Electric should always be running behind a proxy because it exposes the database. The idea is that, by default, Electric requires an API_SECRET, so that only requests presenting that secret will be accepted by the server. A developer has to set up that key in any client that is authorized to make requests --- i.e. the proxy.

With this secret, users will still need to run a proxy because it is unsafe to send requests with the secret from the client. So the client sends a request to the proxy, which then sends a request to Electric with the secret included. Since the user is already running a proxy, it seems that in most cases you would just restrict the Electric server to only accept connections from that proxy server (e.g. via some custom VPC or whatever). Now, I hear that some platforms do not allow securing it this way, but how common is that? Which cloud platforms don't support this? Is that a real concern for us right now? Do we have any users requesting this?
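To illustrate the proxy pattern being described: the browser only ever talks to the proxy, and the proxy attaches the secret server-side before forwarding to Electric. The URL-building step could be sketched like this; the `api_secret` parameter name and `electricOrigin` are illustrative assumptions, not Electric's actual API:

```typescript
// Sketch of the proxy's forwarding step: rebuild the client's request URL
// against the (private) Electric origin and attach the secret server-side.
// The browser never sees `secret`; the proxy authorizes the user first
// (session cookie, JWT, etc.), then forwards with the secret added.
function buildUpstreamUrl(clientUrl: URL, electricOrigin: string, secret: string): URL {
  const upstream = new URL(clientUrl.pathname + clientUrl.search, electricOrigin)
  upstream.searchParams.set('api_secret', secret) // hypothetical parameter name
  return upstream
}
```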

> The best approach to me seems to be an app_key generation endpoint, such that a proxy client with a non-expired/non-revoked key would not lose access to the server even if the secret changes.

If I understand this correctly, Electric would have a way to generate API tokens. So Electric keeps a list of valid API tokens, and every request must present one of these tokens. The question is: if Electric exposes an api/token/new endpoint to generate a new token, how do we secure that endpoint, given that the Electric server is publicly available? We could have a special ELECTRIC_SECRET env var, a kind of admin secret that you need to present in order to generate new API keys. What do you think?

@balegas (Contributor) commented Mar 3, 2025

> Now, I hear that some platforms do not allow securing it this way, but how common is that? Which cloud platforms don't support this? Is that a real concern for us right now? Do we have any clients that we want to onboard that are in this situation?

I agree. People who run software in production know their network. People run private services all the time and have therefore come across this issue. Is it a real concern we need to address? The naive user can always set up the secret unsafely if they don't understand the architecture.

If we want to make people aware of the sensitivity of running Electric on a public network, we should do that through documentation. The fact that your entire database can be read should be enough to raise your attention.

wrt. the API endpoint: you'd still need to present some secret to protect it, but you decouple that from validating previously emitted tokens, so that you can change the secret without revoking access to users that requested API tokens under old secrets.

@kevin-dp (Contributor) commented Mar 3, 2025

> wrt. the API endpoint: you'd still need to present some secret to protect it, but you decouple that from validating previously emitted tokens, so that you can change the secret without revoking access to users that requested API tokens under old secrets.

Indeed. From the discussion here, these tokens seem to be API tokens and not JWTs. So there's no need to sign these tokens with a secret, and hence we don't have the problem of being unable to validate tokens that were generated with a previous secret.

What I mean:

  • Electric is deployed with an ELECTRIC_ADMIN_SECRET env var
  • One can generate a new API token for their app by calling the api/token/new endpoint with the same secret as configured via ELECTRIC_ADMIN_SECRET
  • On receiving this request, Electric verifies the secret and, if it is valid, generates a new API token using a cryptographically secure random function.
  • Electric keeps track of all the API tokens that have been generated (we could store hashes of the API tokens if we want additional security guarantees)
  • On receiving API requests to other routes, Electric checks that the request includes a valid API token.

This way, even if the ELECTRIC_ADMIN_SECRET changes, it does not invalidate previously generated API tokens.
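That scheme could be sketched roughly as follows. All names here are assumptions for illustration; hash-only storage and constant-time secret comparison are included as examples of the "additional security guarantees" mentioned above:

```typescript
import { createHash, randomBytes, timingSafeEqual } from 'node:crypto'

// Sketch of the proposed token scheme: tokens are random, not signed, so
// rotating the admin secret does not invalidate previously issued tokens.
class TokenStore {
  private hashes = new Set<string>()
  constructor(private adminSecret: string) {}

  private static hash(token: string): string {
    return createHash('sha256').update(token).digest('hex')
  }

  // api/token/new: requires the admin secret, returns a fresh random token.
  issueToken(presentedSecret: string): string | null {
    const a = Buffer.from(presentedSecret)
    const b = Buffer.from(this.adminSecret)
    if (a.length !== b.length || !timingSafeEqual(a, b)) return null
    const token = randomBytes(32).toString('hex') // cryptographically secure
    this.hashes.add(TokenStore.hash(token)) // store only the hash
    return token
  }

  // All other routes: check the presented token against the stored hashes.
  isValid(token: string): boolean {
    return this.hashes.has(TokenStore.hash(token))
  }

  // Rotating the admin secret leaves previously issued tokens valid.
  rotateAdminSecret(next: string): void {
    this.adminSecret = next
  }
}
```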

@samwillis (Contributor) commented Mar 3, 2025

I want to argue strongly against expanding the scope of this to a more advanced API that issues tokens. We do not need that.

The internet is full of stories of people who accidentally expose their whole database to the internet; Mongo was notorious for it. My fear is that while we are not the database, we are a highly privileged client of it, with a very easy path for a user to leak their db. We do not want Electric to be known for the same issues.

Setting up your network to only allow access from specific IP ranges, while not difficult, also isn't trivial. When you then consider that a lot of users will want to run the proxy layer in an edge function on Cloudflare/AWS/Deno, it becomes even more complex to configure that restriction (you have no idea what the IP range could be!). We have to use CF Tunnels in Cloud to secure our Electric servers.

Adding this API token is super easy, and solves the issue for a large number of users.

If we are concerned about users doing zero-downtime updates of their secret, they can do a phased deploy with multiple secrets in the client that are tried in order. But a user doing zero-downtime deploys is also likely to be a highly skilled user who can configure their network and set an INSECURE=true value (or similar).
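That phased-deploy idea could be sketched as a client-side fallback loop. This assumes the server answers 401/403 for a rejected secret; `tryRequest` is a stand-in for the actual HTTP call:

```typescript
// Sketch of phased secret rotation: during a rollout the proxy is configured
// with both the old and the new secret and tries them in order until one is
// accepted. `tryRequest` abstracts the HTTP call and returns a status code.
async function requestWithFallback(
  secrets: string[],
  tryRequest: (secret: string) => Promise<number>,
): Promise<number> {
  let lastStatus = 401
  for (const secret of secrets) {
    lastStatus = await tryRequest(secret)
    if (lastStatus !== 401 && lastStatus !== 403) return lastStatus // accepted
  }
  return lastStatus // every configured secret was rejected
}
```

Once the rollout completes, the old secret is simply dropped from the list.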

> If we want to make people aware of the sensitivity of running Electric on a public network, we should do that through documentation. The fact that your entire database can be read should be enough to raise your attention.

I don't think it is; people will miss it, or just forget to do it. The point of "secure by default" is exactly that: we remove the possibility of users shooting themselves in the foot.

@kevin-dp (Contributor) commented Mar 3, 2025

@samwillis OK. I'm trying to figure out the requirements for this task, hence my questions.
So you would go with a single SECRET configured as an env var, and every API request needs to present that secret, right? No rolling secrets, no revocations.

@samwillis (Contributor)

@kevin-dp Yes, exactly. I think that was the initial title of the issue, and so a little context was lost.

@balegas (Contributor) commented Mar 3, 2025

> When you then consider that a lot of users will want to run the proxy layer in an edge function on Cloudflare/AWS/Deno, it becomes even more complex to configure that restriction

This is a good argument. I rest my case.

I still think the approach is not the best. We're making it easier for the less-sophisticated developer, but that is exactly the kind who would still misuse the API_SECRET. We're pushing the no-downtime approach onto the developer, but they will probably not be able to handle the issue anyway. Offering the endpoint would address that, but we can do that later. I agree we do need this for the edge-based proxy.

@kevin-dp kevin-dp self-assigned this Mar 3, 2025
@KyleAMathews KyleAMathews linked a pull request Mar 5, 2025 that will close this issue