README.md (+30 −9)
@@ -4,7 +4,7 @@ Python bindings for the Domino API.
Permits interaction with a Domino deployment from Python using the [Domino API](https://dominodatalab.github.io/api-docs/).
-The latest released version is [1.0.4](https://github.com/dominodatalab/python-domino/archive/1.0.4.zip).
+The latest released version is [1.0.5](https://github.com/dominodatalab/python-domino/archive/1.0.5.zip).
## Version Compatibility Matrix
@@ -15,7 +15,7 @@ The latest released version is [1.0.4](https://github.com/dominodatalab/python-d
| 3.6.x or Lower |[0.3.5](http://github.com/dominodatalab/python-domino/archive/0.3.5.zip)|
| 4.1.0 or Higher |[1.0.0](https://github.com/dominodatalab/python-domino/archive/1.0.0.zip) or Higher |
-## Installation
+## Installation
At this time, these Domino Python bindings are not available on PyPI. You can install the latest version of this package from the GitHub `master` branch with the following:
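The command itself falls outside this hunk; the typical form for installing a package from a GitHub branch with pip (shown here as an assumption, not the README's verbatim command) is:

```shell
# Assumed invocation; installs the package straight from the master branch.
pip install git+https://github.com/dominodatalab/python-domino.git
```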
@@ -30,7 +30,7 @@ You can also add `python-domino` to your `requirements.txt` file with the follow
* *domino_token_file:* (Optional) Path to the Domino token file containing the auth token. If not provided, the library will expect to find one in the `DOMINO_TOKEN_FILE` environment variable.
-Note:
+Note:
1. If both `api_key` and `domino_token_file` are available, `domino_token_file` takes precedence.
-2. By default the log level is set to `INFO`. To set the log level to `DEBUG`, set the `DOMINO_LOG_LEVEL` environment variable to `DEBUG`.
+2. By default the log level is set to `INFO`. To set the log level to `DEBUG`, set the `DOMINO_LOG_LEVEL` environment variable to `DEBUG`.
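Putting the set-up options above together, client construction would look roughly like this. This is a sketch based only on the parameters this section describes; the project name, key, host, and token path are placeholders:

```python
from domino import Domino

# Key-based authentication; all values here are placeholders.
domino = Domino(
    "my-username/my-project",
    api_key="YOUR_API_KEY",
    host="https://app.dominodatalab.com",
)

# Token-file authentication; per note 1, if api_key were also supplied,
# domino_token_file would take precedence.
domino = Domino(
    "my-username/my-project",
    domino_token_file="/path/to/token_file",
)
```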
<hr>
## Methods
@@ -87,7 +87,7 @@ Start a new run on the selected project. The parameters are:
* *isDirect:* (Optional) Whether or not this command should be passed directly to a shell.
* *commitId:* (Optional) The commit ID to launch from. If not provided, the run launches from the latest commit.
* *title:* (Optional) A title for the run.
-* *tier:* (Optional) The hardware tier to use for the run. This is the human-readable name of the hardware tier, such as "Free", "Small", or "Medium". Will use the project's default tier if not provided.
+* *tier:* (Optional) The hardware tier to use for the run. This is the human-readable name of the hardware tier, such as "Free", "Small", or "Medium". Will use the project's default tier if not provided.
* *publishApiEndpoint:* (Optional) Whether or not to publish an API endpoint from the resulting output.
<hr>
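A call combining the optional parameters above might look like the following sketch. The file name, title, and tier name are illustrative placeholders, and `domino` is a client constructed as in the set-up section:

```python
# Start a run of a project file on a named hardware tier.
# "main.py", the arguments, the title, and "Medium" are placeholders.
run = domino.runs_start(
    ["main.py", "--epochs", "10"],
    title="Nightly training run",
    tier="Medium",
)
```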
@@ -163,7 +163,7 @@ Publishes an app in the Domino project, or republish an existing app. The parame
### app_unpublish()
-Stops all running apps in the Domino project.
+Stops all running apps in the Domino project.
<hr>
@@ -177,7 +177,7 @@ Starts a new Job (run) in the project
* *environment_id (string):* (Optional) The environment ID to launch the job with. If not provided, the project's default environment is used.
* *on_demand_spark_cluster_properties (dict):* (Optional) On-demand Spark cluster properties. The following properties can be provided in the Spark cluster dictionary:
```
{
    "computeEnvironmentId": "<Environment ID configured with spark>"
@@ -191,15 +191,36 @@ Starts a new Job (run) in the project
    (optional defaults to 0; 1GB is 1000MB Here)
}
```
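As a minimal illustration, the dictionary above could be built and handed to `job_start` (the method this section documents) along these lines. Only the single field shown above is used, its value is a placeholder, and the call itself is left commented because it needs a live Domino deployment:

```python
# Minimal on-demand Spark cluster properties, using only the required
# field shown above; the environment ID is a placeholder.
spark_cluster_props = {
    "computeEnvironmentId": "<Environment ID configured with spark>",
}

# With a client from the set-up section, the job would then be started
# roughly like this (commented out: requires a live deployment):
# domino.job_start("main.py",
#                  on_demand_spark_cluster_properties=spark_cluster_props)
```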
+* *compute_cluster_properties (dict):* (Optional) The compute cluster properties definition contains parameters for
+launching any Domino-supported compute cluster for a job. Use this to launch a job that uses a compute cluster instead of
+the deprecated `on_demand_spark_cluster_properties` field. If `on_demand_spark_cluster_properties` and `compute_cluster_properties`
+are both present, `on_demand_spark_cluster_properties` is ignored. `compute_cluster_properties` contains the following fields:
+```
+{
+    "clusterType": <string, one of "Ray", "Spark">,
+    "computeEnvironmentId": <string, The environment ID for the cluster's nodes>,
+    "computeEnvironmentRevisionSpec": <one of "ActiveRevision", "LatestRevision",