Subscriptions (#2)

* get deps up-to-date, add graphql-ws

* install subscriptions

* did someone say subscriptions?

* blacken

* sure, let's call it a version

* subscribe to something useful
bollwyvl authored Jan 8, 2019
1 parent 1df8647 commit cf80175
Showing 16 changed files with 480 additions and 529 deletions.
2 changes: 0 additions & 2 deletions .gitignore
@@ -5,11 +5,9 @@ _scripts/
*_files/
*.bundle.*
*.egg-info/
*.html
*.log
*.tar.gz
envs/
lib/
Untitled*.ipynb
static/
anaconda-project.yml
9 changes: 9 additions & 0 deletions CHANGELOG.md
@@ -0,0 +1,9 @@
# CHANGELOG

## Unreleased

### 0.2.0
- Subscriptions (in server and GraphiQL)

### 0.1.0
- Basic Capability with contents manager
41 changes: 41 additions & 0 deletions anaconda-project.yml
@@ -0,0 +1,41 @@
name: jupyter-graphql-dev

commands:
lab:
unix: jupyter lab --no-browser --debug
setup:
unix: pip install -e . --ignore-installed --no-deps
black:
unix: black src/py setup.py
atom:
unix: atom .
static:
unix: python -m jupyter_graphql.fetch_static

env_specs:
default:
platforms:
- linux-64
- osx-64
- win-64
inherit_from:
- jupyter-graphql-dev
packages:
- black
- flake8
- beautysh
jupyter-graphql-dev:
packages:
- gql
- graphene
- iso8601
- jupyterlab >=0.35,<0.36
- python >=3.6,<3.7
- requests
- werkzeug
- pip:
- graphql-ws
- graphene-tornado
channels:
- conda-forge
- defaults
13 changes: 5 additions & 8 deletions environment.yml
@@ -5,16 +5,13 @@ channels:
- defaults

dependencies:
- aniso8601
- gql
- graphene
- iso8601
- jupyterlab >=0.35,<0.36
- pip
- promise
- python >=3.6,<3.7
- requests
- rx
- werkzeug
- pip:
- gql
- graphene
- graphql-core
- graphql-relay
- graphql-ws
- graphene-tornado
82 changes: 59 additions & 23 deletions notebooks/gql.ipynb
@@ -16,19 +16,26 @@
"metadata": {},
"outputs": [],
"source": [
"import requests # should investigate writing a tornado transport\n",
"from pprint import pprint\n",
"from getpass import getpass\n",
"from gql import Client, gql\n",
"from gql.transport.requests import RequestsHTTPTransport\n",
"from IPython.display import JSON\n",
"from pprint import pprint"
"from IPython.display import JSON, IFrame\n",
"import requests # should investigate writing a tornado transport"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"You'll need your `jupyter notebook` or `jupyter lab` token (view source, look for `\"token\"`)"
"## Client!"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This has to know where you are. For example for `http://localhost:8888/lab`:"
]
},
{
@@ -37,14 +44,14 @@
"metadata": {},
"outputs": [],
"source": [
"token = getpass()"
"URL = \"http://localhost:8888/graphql\""
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Update to suit!"
"Since this is a _kernel_ talking back to the notebook _server_, you'll need your `jupyter notebook` or `jupyter lab` token (view source, look for `\"token\"`)"
]
},
{
@@ -53,14 +60,7 @@
"metadata": {},
"outputs": [],
"source": [
"URL = \"http://localhost:8888/graphql\""
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Make a client!"
"token = getpass()"
]
},
{
@@ -79,7 +79,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Make a query!"
"## Query!"
]
},
{
@@ -88,11 +88,11 @@
"metadata": {},
"outputs": [],
"source": [
"query = gql('''\n",
"query {\n",
"query = \"\"\"{\n",
" contents(path: \"notebooks/gql.ipynb\") {\n",
" path\n",
" last_modified\n",
" ... on NotebookContents {\n",
" path\n",
" content {\n",
" nbformat\n",
" nbformat_minor\n",
Expand All @@ -107,8 +107,9 @@
" }\n",
" }\n",
"}\n",
"''')\n",
"query"
"\"\"\"\n",
"query_gql = gql(query)\n",
"query_gql"
]
},
{
@@ -124,15 +125,50 @@
"metadata": {},
"outputs": [],
"source": [
"result = client.execute(query)\n",
"result = client.execute(query_gql)\n",
"JSON(result)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Where can you go from here?"
"Where can you go from here? "
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Subscribe!\n",
"With a little work up-front and even less work at query time, the same types from above can be used to power live _subscriptions_. Right now, only contents are available, but many things in the notebook server and broader ecosystem could become \"live\"."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"subscription = f\"subscription {query}\"\n",
"print(subscription)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Go ahead and paste that in the iframe below and hit (▷)!\n",
"> TODO: fix query param parsing!"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"IFrame(URL, width=\"100%\", height=\"400px\")"
]
}
],
@@ -152,7 +188,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.6"
"version": "3.6.7"
}
},
"nbformat": 4,
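The notebook above drives the endpoint through gql's RequestsHTTPTransport. A minimal sketch of the same round trip with plain requests (the JSON-POST body shape and the Authorization header are assumptions about the handler, not something this commit shows) could look like:

# Sketch only: query the /graphql endpoint with plain requests.
# Assumes the handler accepts a standard GraphQL JSON POST and that the
# Jupyter token is honored via the Authorization header.
import requests

URL = "http://localhost:8888/graphql"
token = "..."  # e.g. copy it from `jupyter notebook list`

query = """{
  contents(path: "notebooks/gql.ipynb") {
    ... on NotebookContents {
      path
    }
  }
}"""

response = requests.post(
    URL,
    json={"query": query},
    headers={"Authorization": f"token {token}"},
)
response.raise_for_status()
print(response.json())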
Empty file modified: postBuild (100644 → 100755)
6 changes: 5 additions & 1 deletion setup.cfg
@@ -25,8 +25,12 @@ classifiers =

[options]
install_requires =
jupyterlab
graphene-tornado
graphql-ws
iso8601
notebook
werkzeug

package_dir =
= src/py
packages = find:
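setup.cfg only declares the runtime dependencies; the notebook server still has to be told to load the extension. A hedged sketch of doing that by hand in jupyter_notebook_config.py (the module path comes from src/py/jupyter_graphql; whether the package ships its own enable step is not shown in this commit):

# jupyter_notebook_config.py -- sketch only
c = get_config()  # noqa: F821 -- injected by Jupyter when this file is loaded
c.NotebookApp.nbserver_extensions = {"jupyter_graphql": True}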
19 changes: 10 additions & 9 deletions src/py/jupyter_graphql/__init__.py
@@ -1,11 +1,10 @@
from pathlib import Path
from notebook.base.handlers import FileFindHandler


from notebook.utils import url_path_join as ujoin

from .handlers import GraphQLHandler

from .subscriptions import TornadoSubscriptionServer
from .handlers import GraphQLHandler, SubscriptionHandler
from .schema import schema


@@ -18,6 +17,8 @@ def load_jupyter_server_extension(app):
app.log.info("[graphql] initializing")
web_app = app.web_app

subscription_server = TornadoSubscriptionServer(schema)

# add our templates
web_app.settings["jinja2_env"].loader.searchpath += [TEMPLATES]

@@ -34,12 +35,12 @@ def app_middleware(next, root, info, **args):
(
base(),
GraphQLHandler,
dict(
schema=schema,
graphiql=True,
nb_app=app,
middleware=[app_middleware],
),
dict(schema=schema, graphiql=True, middleware=[app_middleware]),
),
(
base("subscriptions"),
SubscriptionHandler,
dict(subscription_server=subscription_server, app=app),
),
# serve the graphiql assets
(base("static", "(.*)"), FileFindHandler, dict(path=[STATIC])),
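The SubscriptionHandler and TornadoSubscriptionServer themselves live in handlers.py and subscriptions.py, which this page does not render. As a purely hypothetical sketch of the shape such a handler could take (the subscription_server.handle call and the subprotocol negotiation are assumptions, not the project's actual code):

from tornado.websocket import WebSocketHandler


class SubscriptionHandler(WebSocketHandler):
    """Hypothetical shape only; see src/py/jupyter_graphql/handlers.py for the real one."""

    def initialize(self, subscription_server, app):
        # populated from the dict(...) passed at registration time above
        self.subscription_server = subscription_server
        self.app = app

    def select_subprotocol(self, subprotocols):
        # GraphiQL's subscriptions-transport-ws client speaks the "graphql-ws" subprotocol
        return "graphql-ws" if "graphql-ws" in subprotocols else None

    def on_message(self, message):
        # hand each frame to the graphql-ws subscription server (assumed API)
        self.subscription_server.handle(self, message)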
33 changes: 0 additions & 33 deletions src/py/jupyter_graphql/executor.py

This file was deleted.

30 changes: 23 additions & 7 deletions src/py/jupyter_graphql/fetch_static.py
@@ -1,28 +1,44 @@
from urllib.request import urlretrieve
from urllib.parse import urlparse
import sys

from . import STATIC

# Download the file from `url` and save it locally under `file_name`:

ASSETS = [
JSDELIVR_ASSETS = [
"https://cdn.jsdelivr.net/npm/[email protected]/graphiql.css",
"https://cdn.jsdelivr.net/npm/[email protected]/fetch.min.js",
"https://cdn.jsdelivr.net/npm/[email protected]/umd/react.production.min.js",
"https://cdn.jsdelivr.net/npm/[email protected]/umd/react-dom.production.min.js",
"https://cdn.jsdelivr.net/npm/[email protected]/graphiql.min.js",
# "https://cdn.jsdelivr.net/npm/[email protected]/browser/client.js",
# "https://cdn.jsdelivr.net/npm/[email protected]/dist/fetcher.js",
]

UNPKG_ASSETS = [
"https://unpkg.com/[email protected]/browser/client.js",
"https://unpkg.com/[email protected]/browser/client.js",
]


def fetch_static():
for url in ASSETS:
out = (STATIC / urlparse(url).path[1:]).resolve()
if not out.exists():
def fetch_assets(assets, prefix=None, force=False):
for url in assets:
out = STATIC
if prefix:
out = STATIC / prefix
out = (out / urlparse(url).path[1:]).resolve()
if force or not out.exists():
out.parent.mkdir(parents=True, exist_ok=True)
out.write_text("")
print("fetching", url, "to", out)
print(f"fetching\n\t- {url}\n\t> {out.relative_to(STATIC)}")
urlretrieve(url, out)


def fetch_static(force=False):
fetch_assets(JSDELIVR_ASSETS, force=force)
fetch_assets(UNPKG_ASSETS, "npm", force=force)


if __name__ == "__main__":
fetch_static()
fetch_static(force="--force" in sys.argv)
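For reference, the module above can also be driven from Python directly; a short usage sketch (the extra unpkg URL is illustrative only):

# Usage sketch for the module above
from jupyter_graphql.fetch_static import fetch_assets, fetch_static

fetch_static()            # download any assets missing from STATIC
fetch_static(force=True)  # same as `python -m jupyter_graphql.fetch_static --force`

# fetch an additional asset under STATIC/npm (illustrative URL only)
fetch_assets(["https://unpkg.com/some-package@1.0.0/dist/index.js"], prefix="npm")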