Yeedu (1.0.0)
- This matrix defines the abilities available to users with different roles at the tenant level. It specifies what platform admins, admins, users with cluster-management permissions, and regular users can do when managing tenants, resources, and configurations.
| Ability | Platform Admin | Admin | Can Manage Cluster | User |
|---|---|---|---|---|
| Tenant Management | ✔ | ✖ | ✖ | ✖ |
| Manage Default Configurations | ✔ | ✖ | ✖ | ✖ |
| Manage Resources | ✔ | ✔ | ✖ | ✖ |
| Manage Dependency Repository | ✔ | ✔ | ✖ | ✖ |
| Upload Files | ✔ | ✔ | ✔ | ✔ |
| Manage Clusters | ✔ | ✔ | ✔ | ✖ |
| Manage Workspaces | ✔ | ✔ | ✔ | ✔ |
- This matrix outlines the permissions users have at the workspace level. It covers abilities to manage, edit, run, or view jobs, notebooks, and other workspace-related resources.
| Ability | Can Manage | Can Edit | Can Run | Can View |
|---|---|---|---|---|
| Workspace Access Management | ✔ | ✖ | ✖ | ✖ |
| Activate or Inactivate the Job, Notebook, Workspace | ✔ | ✔ | ✖ | ✖ |
| Create or Update Job or Notebooks | ✔ | ✔ | ✖ | ✖ |
| Run or Stop the Job or Notebook | ✔ | ✔ | ✔ | ✖ |
| Read All Job or Notebooks and Runs | ✔ | ✔ | ✔ | ✔ |
Check if the API is up and running
Performs a simple check to confirm the API is operational and able to respond to requests.
Use this endpoint to quickly verify the service health status.
Responses
Response samples
- 200
- 500
{- "uptime": 108.585168939,
- "message": "API is up and running",
- "date": "2024-06-20T14:11:23.423Z"
Retrieve the list of supported cloud providers
Returns all cloud providers supported by the system.
Authorizations:
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[
  {
    "cloud_provider_id": "0",
    "name": "GCP",
    "description": "Provider for creating infrastructure on Google Cloud Platform",
    "from_date": "2024-06-20T14:08:12.294Z",
    "to_date": null
  },
  {
    "cloud_provider_id": "1",
    "name": "AWS",
    "description": "Provider for creating infrastructure on Amazon Web Services",
    "from_date": "2024-06-20T14:08:12.294Z",
    "to_date": null
  },
  {
    "cloud_provider_id": "2",
    "name": "Azure",
    "description": "Provider for creating infrastructure on Azure Cloud Platform",
    "from_date": "2024-06-20T14:08:12.294Z",
    "to_date": null
  }
]
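A minimal sketch of calling this lookup with a bearer token. Token-based authentication, the host, and the lookup path are assumptions here; confirm all of them against the OpenAPI specification.

```python
import os
import requests

BASE_URL = "https://yeedu.example.com/api/v1"                        # placeholder host
headers = {"Authorization": f"Bearer {os.environ['YEEDU_TOKEN']}"}   # assumed auth scheme

# Placeholder lookup path; the real path is defined in the OpenAPI specification.
resp = requests.get(f"{BASE_URL}/lookup/cloud_providers", headers=headers, timeout=10)
resp.raise_for_status()

for provider in resp.json():
    print(provider["cloud_provider_id"], provider["name"])
```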
Get details of a specific cloud provider
Retrieves detailed information about a cloud provider identified by the provided Cloud Provider Id.
Authorizations:
path Parameters
| cloud_provider_id required | integer <int64> Cloud provider Id used for filtering |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "cloud_provider_id": "0",
- "name": "GCP",
- "description": "Provider for creating infrastructure on Google Cloud Platform",
- "from_date": "2024-06-20T14:08:12.294Z",
- "to_date": null
Retrieve supported disk machine types
Returns a list of all disk machine types supported by the system.
Authorizations:
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[
  {
    "disk_type_id": "0",
    "cloud_provider": {
      "name": "GCP",
      "description": "Provider for creating infrastructure on Google Cloud Platform"
    },
    "name": "pd-ssd",
    "has_fixed_size": false,
    "min_size": 10,
    "max_size": 64000,
    "from_date": "2024-06-20T14:08:12.294Z",
    "to_date": null
  }
]
Get available machine zones across providers
Retrieves a list of available machine zones from all cloud providers.
Authorizations:
query Parameters
| limit | integer <int32> Default: 100 The number of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "availability_zone_id": 175,
- "cloud_provider": {
- "cloud_provider_id": 2,
- "name": "Azure",
- "description": "Provider for creating infrastructure on Azure Cloud Platform"
}, - "name": "South Africa West",
- "region": "South Africa West",
- "description": "Africa",
- "from_date": "2024-06-20T14:08:12.310386+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 176,
- "total_pages": 176,
- "limit": 1,
- "next_page": 2
}
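The paginated listings share the same shape: a data array plus a result_set block with current_page, total_pages, and next_page. Below is a sketch of walking every page with Python requests; the host, path, and bearer-token header are placeholders and should be checked against the OpenAPI specification.

```python
import os
import requests

BASE_URL = "https://yeedu.example.com/api/v1"                        # placeholder host
HEADERS = {"Authorization": f"Bearer {os.environ['YEEDU_TOKEN']}"}   # assumed auth scheme

def iter_availability_zones(limit=100):
    """Yield every availability zone by following result_set pagination."""
    page = 1
    while True:
        resp = requests.get(
            f"{BASE_URL}/lookup/availability_zones",  # placeholder path
            headers=HEADERS,
            params={"limit": limit, "pageNumber": page},
            timeout=10,
        )
        resp.raise_for_status()
        payload = resp.json()
        yield from payload["data"]
        result_set = payload["result_set"]
        if result_set["current_page"] >= result_set["total_pages"]:
            break
        page = result_set["next_page"]

for zone in iter_availability_zones():
    print(zone["cloud_provider"]["name"], zone["region"])
```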
Get machine availability zones for a specific cloud provider.
Retrieves details of machine availability zones filtered by a specific Cloud Provider Id.
Authorizations:
path Parameters
| cloud_provider_id required | integer <int64> Cloud Provider Id used for filtering. |
query Parameters
| all | boolean Default: false Enum: true false When set to true, returns all items without pagination. |
| limit | integer <int32> Default: 100 The number of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "availability_zone_id": 175,
- "cloud_provider": {
- "cloud_provider_id": 2,
- "name": "Azure",
- "description": "Provider for creating infrastructure on Azure Cloud Platform"
}, - "name": "South Africa West",
- "region": "South Africa West",
- "description": "Africa",
- "from_date": "2024-06-20T14:08:12.310386+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 176,
- "total_pages": 176,
- "limit": 1,
- "next_page": 2
}
}
Get details of a specific machine availability zone.
Retrieve machine availability zone details based on a specific cloud provider and availability zone ID.
Authorizations:
path Parameters
| cloud_provider_id required | integer <int64> Cloud provider Id used for filtering. |
| availability_zone_id required | integer <int64> Availability zone Id used for filtering. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "availability_zone_id": "108",
- "cloud_provider": {
- "name": "AWS",
- "description": "Provider for creating infrastructure on Amazon Web Services"
}, - "name": "ap-east-1",
- "region": "ap-east-1",
- "description": "Hong Kong, Asia Pacific",
- "from_date": "2024-06-20T14:08:12.310Z",
- "to_date": null
Retrieve supported machine architecture types
Returns a list of machine architecture types supported by the system.
Authorizations:
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "id": "0",
- "architecture_type": "aarch64",
- "from_date": "2025-04-08T11:32:17.278Z",
- "to_date": null
}, - {
- "id": "1",
- "architecture_type": "x86_64",
- "from_date": "2025-04-08T11:32:17.278Z",
- "to_date": null
}
]
Get all the supported machine types.
Returns a list of machine types supported by the system.
Authorizations:
query Parameters
| limit | integer <int32> Default: 100 The number of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "machine_type_id": 241,
- "cloud_provider": {
- "cloud_provider_id": 2,
- "name": "Azure",
- "description": "Provider for creating infrastructure on Azure Cloud Platform"
}, - "name": "Standard_NV72ads_A10_v5",
- "machine_architecture_type": {
- "machine_architecture_type_id": 0,
- "machine_architecture_type": "x86_64"
}, - "vCPUs": 72,
- "memory": "880 GiB",
- "has_cuda": true,
- "gpu_model": "NVIDIA A10",
- "gpus": 2,
- "gpu_memory": "48 GB",
- "cpu_model": [
- "AMD EPYC 74F3V(Milan)"
], - "cpu_min_frequency_GHz": [
- "3.2"
], - "cpu_max_frequency_GHz": [
- "4"
], - "has_local_disk": true,
- "local_disk_size_GB": 1400,
- "local_num_of_disks": 1,
- "local_disk_bus_type": {
- "local_disk_bus_type_id": 0,
- "local_disk_bus_type": "SCSI"
}, - "local_disk_throughput_MB": null,
- "machine_price_ycu": 106,
- "from_date": "2024-06-20T14:08:12.30262+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 241,
- "total_pages": 241,
- "limit": 1,
- "next_page": 2
}
}
Get all machine types for a specific cloud provider
Retrieves machine types filtered by a specific cloud provider ID.
Authorizations:
path Parameters
| cloud_provider_id required | integer <int64> Cloud Provider Id used for filtering. |
query Parameters
| all | boolean Default: false Enum: true false When set to true, returns all items without pagination. |
| limit | integer <int32> Default: 100 The number of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "machine_type_id": 241,
- "cloud_provider": {
- "cloud_provider_id": 2,
- "name": "Azure",
- "description": "Provider for creating infrastructure on Azure Cloud Platform"
}, - "name": "Standard_NV72ads_A10_v5",
- "machine_architecture_type": {
- "machine_architecture_type_id": 0,
- "machine_architecture_type": "x86_64"
}, - "vCPUs": 72,
- "memory": "880 GiB",
- "has_cuda": true,
- "gpu_model": "NVIDIA A10",
- "gpus": 2,
- "gpu_memory": "48 GB",
- "cpu_model": [
- "AMD EPYC 74F3V(Milan)"
], - "cpu_min_frequency_GHz": [
- "3.2"
], - "cpu_max_frequency_GHz": [
- "4"
], - "has_local_disk": true,
- "local_disk_size_GB": 1400,
- "local_num_of_disks": 1,
- "local_disk_bus_type": {
- "local_disk_bus_type_id": 0,
- "local_disk_bus_type": "SCSI"
}, - "local_disk_throughput_MB": null,
- "machine_price_ycu": 106,
- "from_date": "2024-06-20T14:08:12.30262+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 241,
- "total_pages": 241,
- "limit": 1,
- "next_page": 2
}
}
Get machine type details for a specific cloud provider and machine type.
Retrieve details of a machine type filtered by a specific Cloud Provider Id and Machine Type Id.
Authorizations:
path Parameters
| cloud_provider_id required | integer <int64> Cloud Provider Id used for filtering. |
| machine_type_id required | integer <int64> Machine Type Id used for filtering. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "machine_type_id": 108,
- "cloud_provider": {
- "cloud_provider_id": 1,
- "name": "AWS",
- "description": "Provider for creating infrastructure on Amazon Web Services"
}, - "name": "m6id.4xlarge",
- "machine_architecture_type": {
- "machine_architecture_type_id": 0,
- "machine_architecture_type": "x86_64"
}, - "vCPUs": 16,
- "memory": "64 GiB",
- "has_cuda": false,
- "gpu_model": null,
- "gpus": 0,
- "gpu_memory": null,
- "cpu_model": [
- "Intel Xeon 8375C"
], - "cpu_min_frequency_GHz": [
- "2.9"
], - "cpu_max_frequency_GHz": [
- "3.5"
], - "has_local_disk": true,
- "local_disk_size_GB": 950,
- "local_num_of_disks": 1,
- "local_disk_bus_type": {
- "local_disk_bus_type_id": 1,
- "local_disk_bus_type": "NVME"
}, - "local_disk_throughput_MB": null,
- "machine_price_ycu": 10.4,
- "from_date": "2024-06-20T14:08:12.30262+00:00",
- "to_date": "infinity"
}
Retrieve cloud label patterns.
Returns all cloud label patterns supported by the system.
Authorizations:
query Parameters
| cloud_provider | string Enum: "GCP" "AWS" "Azure" Cloud provider used for filtering. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[
  {
    "cloud_label_pattern_id": 0,
    "key_regex": "^([a-z]{1})([a-z0-9_\\-]{0,62})?$",
    "value_regex": "^[a-z0-9_-]{0,63}$",
    "description": "Keys must start with a lowercase letter (a-z) and can be up to 63 characters long, including alphanumeric characters, underscores, and hyphens. Values can be up to 63 characters long with the same allowed characters.",
    "cloud_provider": {
      "cloud_provider_id": 0,
      "name": "GCP",
      "description": "Provider for creating infrastructure on Google Cloud Platform"
    },
    "from_date": "2025-07-23T08:06:33.359453+00:00",
    "to_date": "infinity"
  },
  {
    "cloud_label_pattern_id": 1,
    "key_regex": "^(?!aws:)[a-zA-Z0-9.\\-_@:+ ]{1,127}$",
    "value_regex": "^[a-zA-Z0-9.\\-_@:+ ]{0,255}$",
    "description": "Keys must not start with 'aws:' and can have a max length of 128 characters. Values can have a max length of 256 characters. Both allow spaces, alphanumeric and special characters ('.', '-', '_', '@', ':', '+').",
    "cloud_provider": {
      "cloud_provider_id": 1,
      "name": "AWS",
      "description": "Provider for creating infrastructure on Amazon Web Services"
    },
    "from_date": "2025-07-23T08:06:33.359453+00:00",
    "to_date": "infinity"
  },
  {
    "cloud_label_pattern_id": 2,
    "key_regex": "^[a-zA-Z0-9.\\-_@:+ ]{1,511}$",
    "value_regex": "^[a-zA-Z0-9.\\-_@:+ ]{0,255}$",
    "description": "Keys can be up to 512 characters, values up to 256 characters. Both allow spaces, alphanumeric and special characters ('.', '-', '_', '@', ':', '+').",
    "cloud_provider": {
      "cloud_provider_id": 2,
      "name": "Azure",
      "description": "Provider for creating infrastructure on Azure Cloud Platform"
    },
    "from_date": "2025-07-23T08:06:33.359453+00:00",
    "to_date": "infinity"
  }
]
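Because the key and value patterns are returned as regular expressions, a client can validate labels locally before creating resources. A small sketch in Python using the patterns from the sample above (the PATTERNS table simply copies the key_regex/value_regex values):

```python
import re

# key_regex / value_regex pairs copied from the response sample above.
PATTERNS = {
    "GCP": (r"^([a-z]{1})([a-z0-9_\-]{0,62})?$", r"^[a-z0-9_-]{0,63}$"),
    "AWS": (r"^(?!aws:)[a-zA-Z0-9.\-_@:+ ]{1,127}$", r"^[a-zA-Z0-9.\-_@:+ ]{0,255}$"),
    "Azure": (r"^[a-zA-Z0-9.\-_@:+ ]{1,511}$", r"^[a-zA-Z0-9.\-_@:+ ]{0,255}$"),
}

def label_is_valid(cloud_provider: str, key: str, value: str) -> bool:
    """Check a key/value label against the provider's published patterns."""
    key_regex, value_regex = PATTERNS[cloud_provider]
    return bool(re.match(key_regex, key)) and bool(re.match(value_regex, value))

print(label_is_valid("GCP", "team", "data-platform"))   # True
print(label_is_valid("AWS", "aws:reserved", "x"))       # False: keys must not start with 'aws:'
```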
Retrieve supported credential types
Returns a list of all credential types supported by the system.
Authorizations:
query Parameters
| cloud_provider | string Enum: "GCP" "AWS" "Azure" Cloud provider used for filtering. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "credential_type_id": 0,
- "name": "Google Service Account",
- "cloud_provider": {
- "cloud_provider_id": 0,
- "name": "GCP",
- "description": "Provider for creating infrastructure on Google Cloud Platform"
}, - "from_date": "2024-06-20T14:08:12.297945+00:00",
- "to_date": "infinity"
}, - {
- "credential_type_id": 1,
- "name": "AWS Access Secret Key Pair",
- "cloud_provider": {
- "cloud_provider_id": 1,
- "name": "AWS",
- "description": "Provider for creating infrastructure on Amazon Web Services"
}, - "from_date": "2024-06-20T14:08:12.297945+00:00",
- "to_date": "infinity"
}, - {
- "credential_type_id": 2,
- "name": "Azure Service Principal",
- "cloud_provider": {
- "cloud_provider_id": 2,
- "name": "Azure",
- "description": "Provider for creating infrastructure on Azure Cloud Platform"
}, - "from_date": "2024-06-20T14:08:12.297945+00:00",
- "to_date": "infinity"
}
]
Retrieve supported cluster types
Returns a list of cluster types supported by the system.
Authorizations:
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "cluster_type_id": "0",
- "name": "YEEDU",
- "description": "Yeedu Mode",
- "has_turbo_support": true,
- "has_cuda_support": true,
- "from_date": "2025-04-10T12:17:39.769Z",
- "to_date": null
}, - {
- "cluster_type_id": "1",
- "name": "STANDALONE",
- "description": "Spark Cluster Single Machine",
- "has_turbo_support": false,
- "has_cuda_support": true,
- "from_date": "2025-04-10T12:17:39.769Z",
- "to_date": null
}, - {
- "cluster_type_id": "2",
- "name": "CLUSTER",
- "description": "Spark Cluster Multiple Machines",
- "has_turbo_support": false,
- "has_cuda_support": true,
- "from_date": "2025-04-10T12:17:39.769Z",
- "to_date": null
}
]
Get engine cluster instance statuses
Returns all defined statuses for engine cluster instances.
Authorizations:
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "engine_cluster_instance_status_id": "1",
- "name": "RUNNING",
- "description": "Cluster was created and its running",
- "from_date": "2024-06-20T14:08:12.296Z",
- "to_date": null
}
]
Retrieve Spark compute types
Returns all Spark compute types defined in the system.
Authorizations:
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "spark_compute_type_id": "0",
- "name": "YEEDU",
- "description": null,
- "from_date": "2024-06-20T14:08:12.314Z",
- "to_date": null
}
]
Retrieve Spark infrastructure versions
Returns all supported versions of Spark infrastructure.
Authorizations:
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "spark_infra_version_id": 5,
- "spark_docker_image_name": "v3.5.3-6",
- "spark_version": "3.5.3",
- "hive_version": "2.3.9",
- "hadoop_version": "3.2.4",
- "scala_version": "2.12.15",
- "python_version": "3.9.5",
- "notebook_support": true,
- "has_cuda_support": false,
- "thrift_support": true,
- "yeedu_functions_support": true,
- "has_turbo_support": true,
- "turbo_version": "v1.0.7",
- "has_unity_support": true,
- "unity_version": "v1.0.7",
- "has_hive_support": true,
- "cuda_rapids_version": "23.04.1",
- "from_date": "2025-07-23T08:03:07.022Z",
- "to_date": null
}
]
Get all the Spark job statuses.
Returns all defined statuses for Spark jobs.
Authorizations:
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "spark_job_status_id": "0",
- "name": "SUBMITTED",
- "description": "Job was submitted, waiting for Application ID",
- "from_date": "2024-03-07T04:55:42.780Z",
- "to_date": null
}
]
Retrieve supported Spark job types
Returns a list of Spark job types supported by the system.
Authorizations:
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "spark_job_type_id": 0,
- "name": "SPARK_JOB",
- "from_date": "2024-09-03T10:19:17.940Z",
- "to_date": null
}, - {
- "spark_job_type_id": 1,
- "name": "NOTEBOOK",
- "from_date": "2024-09-03T10:19:17.940Z",
- "to_date": null
}, - {
- "spark_job_type_id": 2,
- "name": "SPARK_SQL",
- "from_date": "2024-09-03T10:19:17.940Z",
- "to_date": null
}, - {
- "spark_job_type_id": 3,
- "name": "THRIFT_SQL",
- "from_date": "2024-09-03T10:19:17.940Z",
- "to_date": null
}, - {
- "spark_job_type_id": 4,
- "name": "YEEDU_FUNCTIONS",
- "from_date": "2024-09-03T10:19:17.940Z",
- "to_date": null
}
]
Retrieve Spark job type languages
Returns all supported languages for Spark job types, optionally filtered by job type.
Authorizations:
query Parameters
| job_type | Array of strings Items Enum: "SPARK_JOB" "SPARK_SQL" "NOTEBOOK" "THRIFT_SQL" "YEEDU_FUNCTIONS" Specifies the Spark Job Type to filter on. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "spark_job_type_lang_id": 0,
- "spark_job_type": {
- "spark_job_type_id": 0,
- "name": "SPARK_JOB"
}, - "lang_name": "RAW_SCALA",
- "from_date": "2024-03-07T04:55:42.783139+00:00",
- "to_date": "infinity",
- "support_runtime_args": false,
- "support_runtime_conf": false
}
]
Get all the workflow execution states.
Retrieves a list of workflow execution states defined by the system.
Authorizations:
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "workflow_execution_state_id": -1,
- "name": "NONE",
- "description": "NONE",
- "from_date": "2024-03-07T04:55:42.777Z",
- "to_date": null
}
]
Retrieve supported workflow types
Returns all workflow types defined in the system.
Authorizations:
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "workflow_type_id": "0",
- "name": "spark_start",
- "queue_name": "yeedu.workflows.usi.start",
- "description": null,
- "from_date": "2024-03-07T04:55:42.778Z",
- "to_date": null
}
]
Get supported Linux distributions
Returns a list of Linux distributions supported by the system.
Authorizations:
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "linux_distro_id": "1",
- "distro_name": "UBUNTU",
- "distro_version": "22.04 LTS",
- "from_date": "2024-05-28T14:47:09.860Z",
- "to_date": null
}
]
Retrieve supported Metastore Catalogs
Returns a list of Metastore Catalogs supported by the system.
Authorizations:
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "metastore_catalog_type_id": "1",
- "name": "DATABRICKS UNITY",
- "description": "",
- "from_date": "2025-03-16T07:08:43.118Z",
- "to_date": null
}
]
Get supported secret types
Retrieves a list of supported secret types.
Authorizations:
query Parameters
| secret_supports | string Enum: "unity" "hive" "storage" Optional filter to return only secret types that support a specific platform or integration. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "lookup_secret_type_id": "1",
- "name": "HIVE KERBEROS",
- "description": "Kerberos principal and keytab used to authenticate clients against a secure Hive service.",
- "supports_unity": false,
- "supports_hive": true,
- "supports_storage_type": null,
- "from_date": "2025-05-07T13:16:36.338608+00:00",
- "to_date": "infinity"
}
]
Get all the storage types.
Returns all storage types supported by the system.
Authorizations:
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "lookup_storage_type_id": 1,
- "name": "Google Cloud Storage",
- "description": "Scalable, secure object storage by Google Cloud for global data access.",
- "from_date": "2025-05-07T13:16:36.338608+00:00",
- "end_date": null
}
]
Get all the machine volume configurations.
Returns a list of machine volume configurations, optionally filtered by cloud provider.
Authorizations:
query Parameters
| cloud_provider | string Enum: "GCP" "AWS" "Azure" Cloud provider used for filtering. |
| all | boolean Default: false Enum: true false When set to true, returns all items without pagination. |
| limit | integer <int32> Default: 100 The number of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "volume_conf_id": 24,
- "name": "volume_gcp_22",
- "description": null,
- "encrypted": true,
- "size": 2000,
- "disk_type": {
- "disk_type_id": 2,
- "cloud_provider": {
- "cloud_provider_id": "0,",
- "name": "GCP"
}, - "name": "local-ssd",
- "has_fixed_size": true,
- "min_size": 375,
- "max_size": 375,
- "has_fixed_iops": true,
- "min_iops": 170000,
- "max_iops": 3200000,
- "has_fixed_throughput": true,
- "min_throughput": 660,
- "max_throughput": 12480
}, - "machine_volume_num": 2,
- "machine_volume_strip_num": 1,
- "disk_iops": 170000,
- "disk_throughput_MB": 660,
- "created_by": null,
- "modified_by": null,
- "last_update_date": "2024-06-25T14:54:39.980421+00:00",
- "from_date": "2024-06-25T14:54:39.980421+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 24,
- "total_pages": 24,
- "limit": 1,
- "next_page": 2
}
}
Search machine volume configurations by name
Searches for machine volume configurations matching the provided name. Supports pagination and optional filtering by cloud provider.
Authorizations:
query Parameters
| volume_conf_name required | string Volume configuration name used for filtering. |
| cloud_provider | string Enum: "GCP" "AWS" "Azure" Cloud provider used for filtering. |
| limit | integer <int32> Default: 100 The number of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "volume_conf_id": 24,
- "name": "volume_gcp_22",
- "description": null,
- "encrypted": true,
- "size": 2000,
- "disk_type": {
- "disk_type_id": 2,
- "cloud_provider": {
- "cloud_provider_id": "0,",
- "name": "GCP"
}, - "name": "local-ssd",
- "has_fixed_size": true,
- "min_size": 375,
- "max_size": 375,
- "has_fixed_iops": true,
- "min_iops": 170000,
- "max_iops": 3200000,
- "has_fixed_throughput": true,
- "min_throughput": 660,
- "max_throughput": 12480
}, - "machine_volume_num": 2,
- "machine_volume_strip_num": 1,
- "disk_iops": 170000,
- "disk_throughput_MB": 660,
- "created_by": null,
- "modified_by": null,
- "last_update_date": "2024-06-25T14:54:39.980421+00:00",
- "from_date": "2024-06-25T14:54:39.980421+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 24,
- "total_pages": 24,
- "limit": 1,
- "next_page": 2
}
}
Create a new volume configuration.
Creates a new machine volume configuration using the details provided in the request body. Returns the created configuration upon success.
Authorizations:
Request Body schema: application/jsonrequired
The volume configuration to be added.
| name required | string non-empty |
| description | string or null non-empty |
| encrypted required | boolean |
| size | integer <int64> |
| disk_type_id required | integer <int64> |
| machine_volume_num required | integer <int64> >= 1 |
| machine_volume_strip_num required | integer <int64> >= 1 |
| disk_iops | integer <int64> |
| disk_throughput_MB | integer <int64> |
Responses
Request samples
- Payload
{- "name": "yeedu_volume",
- "description": "dev volume",
- "encrypted": true,
- "size": 100,
- "disk_type_id": 0,
- "machine_volume_num": 1,
- "machine_volume_strip_num": 1
Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "volume_conf_id": "25",
- "name": "yeedu_volume",
- "description": "dev volume",
- "encrypted": true,
- "size": "100",
- "disk_type_id": "0",
- "disk_type_name": "pd-ssd",
- "machine_volume_num": 1,
- "machine_volume_strip_num": 1,
- "disk_iops": null,
- "disk_throughput_MB": null,
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-03-08T05:34:55.306Z",
- "from_date": "2024-03-08T05:34:55.306Z",
- "to_date": null
}
Get details of a specific volume configuration.
Retrieves detailed information about a volume configuration, filtered by volume configuration ID or name.
Authorizations:
query Parameters
| volume_conf_id | integer <int64> Volume configuration ID used for filtering. |
| volume_conf_name | string Volume configuration name used for filtering. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "volume_conf_id": 18,
- "volume_conf_name": "volume_gcp_22",
- "description": null,
- "encrypted": true,
- "size": 500,
- "disk_type": {
- "disk_type_id": 2,
- "cloud_provider": {
- "cloud_provider_id": "0,",
- "name": "GCP"
}, - "name": "local-ssd",
- "has_fixed_size": true,
- "min_size": 375,
- "max_size": 375,
- "has_fixed_iops": true,
- "min_iops": 170000,
- "max_iops": 3200000,
- "has_fixed_throughput": true,
- "min_throughput": 660,
- "max_throughput": 12480
}, - "machine_volume_num": 1,
- "machine_volume_strip_num": 1,
- "disk_iops": 170000,
- "disk_throughput_MB": 660,
- "created_by": null,
- "modified_by": null,
- "last_update_date": "2024-06-25T14:54:39.980421+00:00",
- "from_date": "2024-06-25T14:54:39.980421+00:00",
- "to_date": "infinity"
}
Update details of a specific volume configuration.
Updates an existing volume configuration identified by volume configuration ID or name with the details provided in the request body. Returns the updated configuration.
Authorizations:
query Parameters
| volume_conf_id | integer <int64> Volume configuration ID used for modification. |
| volume_conf_name | string Volume configuration name used for modification. |
Request Body schema: application/jsonrequired
The volume configuration to be modified.
| name | string non-empty |
| description | string or null non-empty |
| encrypted | boolean |
| machine_volume_num | integer <int64> |
| machine_volume_strip_num | integer <int64> |
| size | integer <int64> |
| disk_type_id | integer <int64> |
| disk_iops | integer or null <int64> |
| disk_throughput_MB | integer or null <int64> |
Responses
Request samples
- Payload
{- "name": "yeedu_volume",
- "description": "dev volume",
- "encrypted": true,
- "machine_volume_num": 1,
- "machine_volume_strip_num": 1,
- "size": 100,
- "disk_type_id": 0,
- "disk_iops": 3000,
- "disk_throughput_MB": 125
Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "volume_conf_id": "25",
- "name": "yeedu_volume",
- "description": "dev volume",
- "encrypted": true,
- "size": "100",
- "disk_type_id": "0",
- "disk_type_name": "pd-ssd",
- "machine_volume_num": 1,
- "machine_volume_strip_num": 1,
- "disk_iops": null,
- "disk_throughput_MB": null,
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-03-08T05:34:55.306Z",
- "from_date": "2024-03-08T05:34:55.306Z",
- "to_date": null
}
Delete a specific volume configuration.
Deletes a volume configuration identified by volume configuration ID or name. Returns confirmation of the deletion.
Authorizations:
query Parameters
| volume_conf_id | integer <int64> Volume configuration ID used for deletion. |
| volume_conf_name | string Volume configuration name used for deletion. |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 500
{- "message": "Deleted machine volume configuration ID: 1."
Get all the machine network configurations.
Returns a paginated list of machine network configurations, optionally filtered by cloud provider.
Authorizations:
query Parameters
| cloud_provider | string Enum: "GCP" "AWS" "Azure" Cloud provider used for filtering. |
| all | boolean Default: false Enum: true false When set to true, returns all items without pagination. |
| limit | integer <int32> Default: 100 The number of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "network_conf_id": 7,
- "network_conf_name": "azure_network",
- "description": null,
- "network_project_id": "yeedu-project",
- "network_name": "yeedu-network",
- "network_tags": [
- "yeedu",
- "iap-allow"
], - "subnet": "yeedu-subnet",
- "availability_zone": {
- "availability_zone_id": 129,
- "cloud_provider": {
- "cloud_provider_id": 2,
- "name": "Azure",
- "description": "Provider for creating infrastructure on Azure Cloud Platform"
}, - "name": "East US",
- "region": "East US",
- "description": "US, North America"
}, - "tenant_id": "ec234745-4625-4abc-9ae7-1f8a9a9090b3",
- "created_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2024-06-27T06:16:43.073805+00:00",
- "from_date": "2024-06-27T06:16:43.073805+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 3,
- "total_pages": 3,
- "limit": 1,
- "next_page": 2
}
}
Search machine network configurations by name
Searches for machine network configurations matching the provided network configuration name. Supports pagination and optional filtering by cloud provider.
Authorizations:
query Parameters
| network_conf_name required | string Network configuration name used for filtering. |
| cloud_provider | string Enum: "GCP" "AWS" "Azure" Cloud provider used for filtering. |
| limit | integer <int32> Default: 100 The number of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "network_conf_id": 7,
- "network_conf_name": "azure_network",
- "description": null,
- "network_project_id": "yeedu-project",
- "network_name": "yeedu-network",
- "network_tags": [
- "yeedu",
- "iap-allow"
], - "subnet": "yeedu-subnet",
- "availability_zone": {
- "availability_zone_id": 129,
- "cloud_provider": {
- "cloud_provider_id": 2,
- "name": "Azure",
- "description": "Provider for creating infrastructure on Azure Cloud Platform"
}, - "name": "East US",
- "region": "East US",
- "description": "US, North America"
}, - "tenant_id": "ec234745-4625-4abc-9ae7-1f8a9a9090b3",
- "created_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2024-06-27T06:16:43.073805+00:00",
- "from_date": "2024-06-27T06:16:43.073805+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 3,
- "total_pages": 3,
- "limit": 1,
- "next_page": 2
}
}
Create a new network configuration.
Creates a new machine network configuration using the details provided in the request body. Returns the created network configuration upon success.
Authorizations:
Request Body schema: application/jsonrequired
The network configuration to be added.
| name required | string non-empty |
| description | string or null non-empty |
| network_project_id required | string non-empty |
| network_name required | string non-empty |
| network_tags | Array of strings unique [ items non-empty ] |
| subnet required | string non-empty |
| availability_zone_id required | integer <int64> |
Responses
Request samples
- Payload
{- "name": "yeedu_network",
- "description": "Yeedu machine network config.",
- "network_project_id": "yeedu",
- "network_name": "yeedu-spark-vpc",
- "network_tags": [
- "yeedu",
- "iap-allow"
], - "subnet": "custom-subnet-yeedu",
- "availability_zone_id": 75
}
Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "network_conf_id": "1",
- "name": "yeedu_network",
- "description": "Yeedu machine network config.",
- "network_project_id": "yeedu",
- "network_name": "yeedu-spark-vpc",
- "network_tags": [
- "yeedu",
- "iap-allow"
], - "subnet": "custom-subnet-yeedu",
- "availability_zone_id": "75",
- "tenant_id": "eeaf2b29-d2b0-4c32-8d74-4c9b565ce8bd",
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-03-08T13:45:11.117224+00:00",
- "from_date": "2024-03-08T13:45:11.117224+00:00",
- "to_date": null
}
Get details of a specific network configuration.
Retrieves detailed information about a network configuration, filtered by network configuration ID or name.
Authorizations:
query Parameters
| network_conf_id | integer <int64> Network configuration ID used for filtering. |
| network_conf_name | string Network configuration name used for filtering. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "network_conf_id": 1,
- "network_conf_name": "aws_network",
- "description": null,
- "network_project_id": "1234-4569-8765",
- "network_name": "eni-0123d56edd7c8fe9f",
- "network_tags": [
- "sg-0ac12ceaffcd345fa"
], - "subnet": "subnet-0d12345a67890d12d",
- "availability_zone": {
- "availability_zone_id": 103,
- "cloud_provider": {
- "cloud_provider_id": 1,
- "name": "AWS",
- "description": "Provider for creating infrastructure on Amazon Web Services"
}, - "name": "us-east-2",
- "region": "us-east-2",
- "description": "Ohio, US East"
}, - "tenant_id": "ec234745-4625-4abc-9ae7-1f8a9a9090b3",
- "created_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2024-06-26T04:55:34.255071+00:00",
- "from_date": "2024-06-26T04:55:34.255071+00:00",
- "to_date": "infinity"
}
Update details of a specific network configuration.
Updates an existing network configuration identified by network configuration ID or name with the provided details. Returns the updated network configuration upon success.
Authorizations:
query Parameters
| network_conf_id | integer <int64> Network configuration ID used for modification. |
| network_conf_name | string Network configuration name used for modification. |
Request Body schema: application/jsonrequired
The network configuration to be modified.
| name | string non-empty |
| description | string or null non-empty |
| network_project_id | string non-empty |
| network_name | string non-empty |
| network_tags | Array of strings unique [ items non-empty ] |
| subnet | string non-empty |
| availability_zone_id | integer <int64> |
Responses
Request samples
- Payload
{- "name": "yeedu_network",
- "description": "Yeedu machine network config.",
- "network_project_id": "yeedu",
- "network_name": "yeedu-spark-vpc",
- "network_tags": [
- "yeedu",
- "iap-allow"
], - "subnet": "custom-subnet-yeedu",
- "availability_zone_id": 75
}
Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "network_conf_id": "1",
- "name": "yeedu_network",
- "description": "Yeedu machine network config.",
- "network_project_id": "yeedu",
- "network_name": "yeedu-spark-vpc",
- "network_tags": [
- "yeedu",
- "iap-allow"
], - "subnet": "custom-subnet-yeedu",
- "availability_zone_id": "75",
- "tenant_id": "eeaf2b29-d2b0-4c32-8d74-4c9b565ce8bd",
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-03-08T13:45:11.117224+00:00",
- "from_date": "2024-03-08T13:45:11.117224+00:00",
- "to_date": null
}
Delete a specific network configuration.
Deletes a machine network configuration identified by network configuration ID or name. Returns confirmation upon successful deletion.
Authorizations:
query Parameters
| network_conf_id | integer <int64> Network configuration ID used for deletion. |
| network_conf_name | string Network configuration name used for deletion. |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 500
{- "message": "Deleted machine network configuration ID: 1."
}
Get all the boot disk image configurations.
Returns a paginated list of boot disk image configurations, optionally filtered by cloud provider and machine architecture type.
Authorizations:
query Parameters
| cloud_provider | string Enum: "GCP" "AWS" "Azure" Cloud provider used for filtering. |
| architecture_type | string Enum: "x86_64" "aarch64" Machine Architecture Type used for filtering. |
| all | boolean Default: false Enum: true false When set to true, returns all items without pagination. |
| limit | integer <int32> Default: 100 The number of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "boot_disk_image_id": 5,
- "boot_disk_image_name": "azure_rhel",
- "description": "Base image for Azure RHEL",
- "cloud_provider": {
- "cloud_provider_id": 2,
- "name": "Azure",
- "description": "Provider for creating infrastructure on Azure Cloud Platform"
}, - "linux_distro": {
- "linux_distro_id": 2,
- "distro_name": "RHEL",
- "distro_version": "8"
}, - "boot_disk_image": "RedHat:RHEL:8-lvm-gen2:latest",
- "machine_architecture_type": {
- "machine_architecture_type_id": 0,
- "machine_architecture_type": "x86_64"
}, - "created_by": null,
- "modified_by": null,
- "last_update_date": "2024-06-25T14:54:39.990791+00:00",
- "from_date": "2024-06-25T14:54:39.990791+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 5,
- "total_pages": 5,
- "limit": 1,
- "next_page": 2
}
}
Search boot disk image configurations by name
Searches for boot disk image configurations matching the provided name. Supports pagination and optional filtering by cloud provider and machine architecture type.
Authorizations:
query Parameters
| boot_disk_image_name required | string Boot disk image configuration name used for filtering. |
| cloud_provider | string Enum: "GCP" "AWS" "Azure" Cloud provider used for filtering. |
| architecture_type | string Enum: "x86_64" "aarch64" Machine Architecture Type used for filtering. |
| limit | integer <int32> Default: 100 The number of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "boot_disk_image_id": 5,
- "boot_disk_image_name": "azure_rhel",
- "description": "Base image for Azure RHEL",
- "cloud_provider": {
- "cloud_provider_id": 2,
- "name": "Azure",
- "description": "Provider for creating infrastructure on Azure Cloud Platform"
}, - "linux_distro": {
- "linux_distro_id": 2,
- "distro_name": "RHEL",
- "distro_version": "8"
}, - "boot_disk_image": "RedHat:RHEL:8-lvm-gen2:latest",
- "machine_architecture_type": {
- "machine_architecture_type_id": 0,
- "machine_architecture_type": "x86_64"
}, - "created_by": null,
- "modified_by": null,
- "last_update_date": "2024-06-25T14:54:39.990791+00:00",
- "from_date": "2024-06-25T14:54:39.990791+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 5,
- "total_pages": 5,
- "limit": 1,
- "next_page": 2
}
}
Create a new boot disk image configuration.
Creates a new boot disk image configuration using the details provided in the request body. Returns the created configuration upon success.
Authorizations:
Request Body schema: application/jsonrequired
The boot disk image configuration to be added.
| name required | string non-empty |
| description | string or null non-empty |
| boot_disk_image required | string non-empty |
| cloud_provider_id required | integer <int64> |
| linux_distro_id required | integer <int64> |
| architecture_type_id required | integer <int64> |
Responses
Request samples
- Payload
{- "name": "dev_boot_disk",
- "description": "yeedu dev boot disk",
- "boot_disk_image": "ubuntu-os-cloud/ubuntu-2004-lts",
- "cloud_provider_id": 0,
- "linux_distro_id": 0,
- "architecture_type_id": 0
}
Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "boot_disk_image_id": "1",
- "boot_disk_image_name": "dev_boot_disk",
- "description": "yeedu dev boot disk",
- "cloud_provider_id": 0,
- "linux_distro_id": 0,
- "boot_disk_image": "ubuntu-os-cloud/ubuntu-2004-lts",
- "architecture_type_id": 0,
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-03-07T04:55:42.799Z",
- "from_date": "2024-03-07T04:55:42.799Z",
- "to_date": null
}
Get details of a specific boot disk image configuration.
Retrieves detailed information about a boot disk image configuration, filtered by boot disk image configuration ID or name.
Authorizations:
query Parameters
| boot_disk_image_id | integer <int64> Boot disk image configuration ID used for filtering. |
| boot_disk_image_name | string Boot disk image configuration name used for filtering. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "boot_disk_image_id": 2,
- "boot_disk_image_name": "gcp_rhel",
- "description": "Base image for GCP RHEL",
- "cloud_provider": {
- "cloud_provider_id": 0,
- "name": "GCP",
- "description": "Provider for creating infrastructure on Google Cloud Platform"
}, - "linux_distro": {
- "linux_distro_id": 2,
- "distro_name": "RHEL",
- "distro_version": "8"
}, - "boot_disk_image": "rhel-cloud/rhel-8",
- "machine_architecture_type": {
- "machine_architecture_type_id": 0,
- "machine_architecture_type": "x86_64"
}, - "created_by": null,
- "modified_by": null,
- "last_update_date": "2024-06-25T14:54:39.990791+00:00",
- "from_date": "2024-06-25T14:54:39.990791+00:00",
- "to_date": "infinity"
}
Update details of a specific boot disk image configuration.
Updates an existing boot disk image configuration identified by ID or name with the provided details. Returns the updated configuration upon success.
Authorizations:
query Parameters
| boot_disk_image_id | integer <int64> Boot disk image configuration ID used for modification. |
| boot_disk_image_name | string Boot disk image configuration name used for modification. |
Request Body schema: application/jsonrequired
The boot disk image configuration to be modified.
| name | string non-empty |
| description | string or null non-empty |
| boot_disk_image | string non-empty |
| linux_distro_id | integer <int64> |
Responses
Request samples
- Payload
{- "name": "dev_boot_disk",
- "description": "yeedu dev boot disk",
- "boot_disk_image": "ubuntu-os-cloud/ubuntu-2004-lts",
- "linux_distro_id": 0
}
Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "boot_disk_image_id": "1",
- "boot_disk_image_name": "dev_boot_disk",
- "description": "yeedu dev boot disk",
- "cloud_provider_id": 0,
- "linux_distro_id": 0,
- "boot_disk_image": "ubuntu-os-cloud/ubuntu-2004-lts",
- "architecture_type_id": 0,
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-03-07T04:55:42.799Z",
- "from_date": "2024-03-07T04:55:42.799Z",
- "to_date": null
}
Delete a specific boot disk image configuration.
Deletes a boot disk image configuration identified by ID or name. Returns confirmation upon successful deletion.
Authorizations:
query Parameters
| boot_disk_image_id | integer <int64> Boot disk image configuration ID used for deletion. |
| boot_disk_image_name | string Boot disk image configuration name used for deletion. |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 500
{- "message": "Deleted boot disk image configuration ID: 1."
}
Get all the credentials configurations.
Returns a paginated list of credentials configurations, optionally filtered by cloud provider.
Authorizations:
query Parameters
| cloud_provider | string Enum: "GCP" "AWS" "Azure" Cloud provider used for filtering. |
| all | boolean Default: false Enum: true false When set to true, returns all items without pagination. |
| limit | integer <int32> Default: 100 The number of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "credentials_conf_id": 6,
- "credentials_conf_name": "azure_credential",
- "description": null,
- "credential_type": {
- "credential_type_id": 2,
- "name": "Azure Service Principal",
- "cloud_provider": {
- "cloud_provider_id": 2,
- "name": "Azure",
- "description": "Provider for creating infrastructure on Azure Cloud Platform"
}
}, - "tenant_id": "ec234745-4625-4abc-9ae7-1f8a9a9090b3",
- "created_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2024-06-27T06:19:09.111391+00:00",
- "from_date": "2024-06-27T06:19:09.111391+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 3,
- "total_pages": 3,
- "limit": 1,
- "next_page": 2
}
}
Search credentials configurations based on credentials configuration name.
Searches for credentials configurations matching the provided name. Supports pagination and optional filtering by cloud provider.
Authorizations:
query Parameters
| credentials_conf_name required | string Credentials configuration name used for filtering. |
| cloud_provider | string Enum: "GCP" "AWS" "Azure" Cloud provider used for filtering. |
| limit | integer <int32> Default: 100 The number of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "credentials_conf_id": 6,
- "credentials_conf_name": "azure_credential",
- "description": null,
- "credential_type": {
- "credential_type_id": 2,
- "name": "Azure Service Principal",
- "cloud_provider": {
- "cloud_provider_id": 2,
- "name": "Azure",
- "description": "Provider for creating infrastructure on Azure Cloud Platform"
}
}, - "tenant_id": "ec234745-4625-4abc-9ae7-1f8a9a9090b3",
- "created_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2024-06-27T06:19:09.111391+00:00",
- "from_date": "2024-06-27T06:19:09.111391+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 3,
- "total_pages": 3,
- "limit": 1,
- "next_page": 2
}
}
Create a new credentials configuration.
Creates a new credentials configuration with the provided credentials.
Accepted keys for the base64_encoded_credentials:
| Cloud Provider | Credential Type | Accepted Keys |
|---|---|---|
| GCP | Google Service Account | 'TYPE', 'PROJECT_ID', 'PRIVATE_KEY_ID', 'PRIVATE_KEY', 'CLIENT_EMAIL', 'CLIENT_ID', 'AUTH_URI', 'TOKEN_URI', 'AUTH_PROVIDER_X509_CERT_URL', 'CLIENT_X509_CERT_URL' |
| AWS | Access Secret Key Pair | 'AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY', 'AWS_DEFAULT_REGION' |
| Azure | Azure Service Principal | 'CLIENT_ID', 'TENANT_ID', 'CLIENT_SECRET', 'SUBSCRIPTION_ID', 'STORAGE_ACCOUNT_NAME', 'CONTAINER_NAME' |
Authorizations:
Request Body schema: application/jsonrequired
The credentials configuration to be added.
| name required | string non-empty |
| description | string or null non-empty |
| credential_type_id required | integer <int64> |
| base64_encoded_credentials required | object |
Responses
Request samples
- Payload
{- "name": "yeedu-aws",
- "description": "yeedu credentials configuration",
- "credential_type_id": 1,
- "base64_encoded_credentials": {
- "encoded": "ewogIkFXU19BQ0NFU1NfS0VZX0lEIjogIkFXU19BQ0NFU1NfS0VZX0lEIiwKICJBV1NfU0VDUkVUX0FDQ0VTU19LRVkiOiAiQVdTX1NFQ1JFVF9BQ0NFU1NfS0VZIiwKICJBV1NfREVGQVVMVF9SRUdJT04iOiJBV1NfREVGQVVMVF9SRUdJT04iCn0="
}
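The encoded value is simply a base64-encoded JSON object whose keys come from the accepted-keys table above. A sketch of building the AWS variant in Python, assuming bearer-token authentication, a POST method, and a placeholder path:

```python
import base64
import json
import os
import requests

BASE_URL = "https://yeedu.example.com/api/v1"                        # placeholder host
headers = {"Authorization": f"Bearer {os.environ['YEEDU_TOKEN']}"}   # assumed auth scheme

# Accepted keys for "AWS Access Secret Key Pair" (see the table above).
aws_credentials = {
    "AWS_ACCESS_KEY_ID": os.environ["AWS_ACCESS_KEY_ID"],
    "AWS_SECRET_ACCESS_KEY": os.environ["AWS_SECRET_ACCESS_KEY"],
    "AWS_DEFAULT_REGION": "us-east-2",
}
encoded = base64.b64encode(json.dumps(aws_credentials).encode("utf-8")).decode("ascii")

payload = {
    "name": "yeedu-aws",
    "description": "yeedu credentials configuration",
    "credential_type_id": 1,                         # AWS Access Secret Key Pair
    "base64_encoded_credentials": {"encoded": encoded},
}

# Placeholder path and assumed POST method.
resp = requests.post(f"{BASE_URL}/machine/credentials_conf", headers=headers, json=payload, timeout=10)
resp.raise_for_status()
print("Created credentials configuration:", resp.json()["credentials_conf_id"])
```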
Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "credentials_conf_id": "1",
- "name": "yeedu-svc",
- "description": "yeedu credentials configuration",
- "credential_type_id": "0",
- "tenant_id": "d9d98a22-5216-4955-b3d9-b0337d8ac0d9",
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-03-08T12:18:49.334Z",
- "from_date": "2024-03-08T12:18:49.334Z",
- "to_date": null
}
Get details of a specific credentials configuration.
Retrieves detailed information about a credentials configuration, filtered by credentials configuration ID or name.
Authorizations:
query Parameters
| credentials_conf_id | integer <int64> Credentials configuration ID used for filtering. |
| credentials_conf_name | string Credentials configuration name used for filtering. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "credentials_conf_id": 1,
- "credentials_conf_name": "aws_credential",
- "description": null,
- "credential_type": {
- "credential_type_id": 1,
- "name": "AWS Access Secret Key Pair",
- "cloud_provider": {
- "cloud_provider_id": 1,
- "name": "AWS",
- "description": "Provider for creating infrastructure on Amazon Web Services"
}
}, - "tenant_id": "ec234745-4625-4abc-9ae7-1f8a9a9090b3",
- "created_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2024-06-26T04:58:48.922845+00:00",
- "from_date": "2024-06-26T04:58:48.922845+00:00",
- "to_date": "infinity"
}
Update details of a specific credentials configuration.
Updates an existing credentials configuration with the provided credentials.
Accepted keys for the base64_encoded_credentials:
| Cloud Provider | Credential Type | Accepted Keys |
|---|---|---|
| GCP | Google Service Account | 'TYPE', 'PROJECT_ID', 'PRIVATE_KEY_ID', 'PRIVATE_KEY', 'CLIENT_EMAIL', 'CLIENT_ID', 'AUTH_URI', 'TOKEN_URI', 'AUTH_PROVIDER_X509_CERT_URL', 'CLIENT_X509_CERT_URL' |
| AWS | Access Secret Key Pair | 'AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY', 'AWS_DEFAULT_REGION' |
| Azure | Azure Service Principal | 'CLIENT_ID', 'TENANT_ID', 'CLIENT_SECRET', 'SUBSCRIPTION_ID', 'STORAGE_ACCOUNT_NAME', 'CONTAINER_NAME' |
Authorizations:
query Parameters
| credentials_conf_id | integer <int64> Credentials configuration ID used for modification. |
| credentials_conf_name | string Credentials configuration name used for modification. |
Request Body schema: application/jsonrequired
The credentials configuration to be modified.
| name | string non-empty |
| description | string or null non-empty |
| base64_encoded_credentials | object |
Responses
Request samples
- Payload
{- "name": "yeedu-aws",
- "description": "yeedu credentials configuration",
- "base64_encoded_credentials": {
- "encoded": "ewogIkFXU19BQ0NFU1NfS0VZX0lEIjogIkFXU19BQ0NFU1NfS0VZX0lEIiwKICJBV1NfU0VDUkVUX0FDQ0VTU19LRVkiOiAiQVdTX1NFQ1JFVF9BQ0NFU1NfS0VZIiwKICJBV1NfREVGQVVMVF9SRUdJT04iOiJBV1NfREVGQVVMVF9SRUdJT04iCn0="
}
}
Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "credentials_conf_id": "1",
- "name": "yeedu-svc",
- "description": "yeedu credentials configuration",
- "credential_type_id": "0",
- "tenant_id": "d9d98a22-5216-4955-b3d9-b0337d8ac0d9",
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-03-08T12:18:49.334Z",
- "from_date": "2024-03-08T12:18:49.334Z",
- "to_date": null
}
Delete a specific credentials configuration.
Deletes a credentials configuration identified by ID or name. Returns confirmation upon successful deletion.
Authorizations:
query Parameters
| credentials_conf_id | integer <int64> Credentials configuration ID used for deletion. |
| credentials_conf_name | string Credentials configuration name used for deletion. |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 500
{- "message": "Deleted credentials configuration ID: 1"
}
Get all the cloud environments.
Returns a paginated list of cloud environments, optionally filtered by cloud provider and machine architecture type.
Authorizations:
query Parameters
| cloud_provider | string Enum: "GCP" "AWS" "Azure" Cloud provider used for filtering. |
| architecture_type | string Enum: "x86_64" "aarch64" Machine Architecture Type used for filtering. |
| all | boolean Default: false Enum: true false When set to true, returns all items without pagination. |
| limit | integer <int32> Default: 100 The number of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "cloud_env_id": 21,
- "cloud_env_name": "azure_cloud_env",
- "description": "",
- "cloud_provider": {
- "cloud_provider_id": 2,
- "name": "Azure",
- "description": "Provider for creating infrastructure on Azure Cloud Platform"
}, - "availability_zone": {
- "availability_zone_id": 128,
- "name": "Central US",
- "region": "Central US",
- "description": "US, North America"
}, - "network_config": {
- "network_conf_id": 7,
- "name": "azure_test_network",
- "description": "creating network config in Azure cloud",
- "network_project_id": "yeedu-project",
- "network_name": "yeedu-network",
- "network_tags": [
- "yeedu",
- "iap-allow"
], - "subnet": "yeedu-subnet",
- "availability_zone": {
- "availability_zone_id": 129,
- "name": "East US",
- "region": "East US",
- "description": "US, North America"
}
}, - "cloud_project": "12345678",
- "credential_config": {
- "credential_conf_id": 8,
- "name": "azure_test_credential",
- "description": null,
- "credential_type": {
- "credential_type_id": 2,
- "name": "Azure Service Principal"
}
}, - "boot_disk_image": {
- "boot_disk_image_id": 4,
- "name": "azure_ubuntu",
- "description": "Base image for Azure Ubuntu",
- "linux_distro": {
- "linux_distro_id": 1,
- "distro_name": "UBUNTU",
- "distro_version": "22.04 LTS"
}, - "boot_disk_image": "Canonical:0001-com-ubuntu-server-jammy:22_04-lts-gen2:latest",
- "architecture_type": {
- "machine_architecture_type_id": 0,
- "machine_architecture_type": "x86_64"
}
}, - "tenant_id": "a6a9c5ea-57b6-4a1c-aa99-84645f675b62",
- "created_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2024-05-02T09:15:58.363635+00:00",
- "from_date": "2024-05-02T09:15:58.363635+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 11,
- "total_pages": 11,
- "limit": 1,
- "next_page": 2
}
}
Search cloud environments based on cloud environment name.
Searches for cloud environments matching the provided name. Supports pagination and optional filtering by cloud provider and machine architecture type.
Authorizations:
query Parameters
| cloud_env_name required | string Cloud environment name used for filtering. |
| cloud_provider | string Enum: "GCP" "AWS" "Azure" Cloud provider used for filtering. |
| architecture_type | string Enum: "x86_64" "aarch64" Machine Architecture Type used for filtering. |
| limit | integer <int32> Default: 100 The number of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "cloud_env_id": 21,
- "cloud_env_name": "azure_cloud_env",
- "description": "",
- "cloud_provider": {
- "cloud_provider_id": 2,
- "name": "Azure",
- "description": "Provider for creating infrastructure on Azure Cloud Platform"
}, - "availability_zone": {
- "availability_zone_id": 128,
- "name": "Central US",
- "region": "Central US",
- "description": "US, North America"
}, - "network_config": {
- "network_conf_id": 7,
- "name": "azure_test_network",
- "description": "creating network config in Azure cloud",
- "network_project_id": "yeedu-project",
- "network_name": "yeedu-network",
- "network_tags": [
- "yeedu",
- "iap-allow"
], - "subnet": "yeedu-subnet",
- "availability_zone": {
- "availability_zone_id": 129,
- "name": "East US",
- "region": "East US",
- "description": "US, North America"
}
}, - "cloud_project": "12345678",
- "credential_config": {
- "credential_conf_id": 8,
- "name": "azure_test_credential",
- "description": null,
- "credential_type": {
- "credential_type_id": 2,
- "name": "Azure Service Principal"
}
}, - "boot_disk_image": {
- "boot_disk_image_id": 4,
- "name": "azure_ubuntu",
- "description": "Base image for Azure Ubuntu",
- "linux_distro": {
- "linux_distro_id": 1,
- "distro_name": "UBUNTU",
- "distro_version": "22.04 LTS"
}, - "boot_disk_image": "Canonical:0001-com-ubuntu-server-jammy:22_04-lts-gen2:latest",
- "architecture_type": {
- "machine_architecture_type_id": 0,
- "machine_architecture_type": "x86_64"
}
}, - "tenant_id": "a6a9c5ea-57b6-4a1c-aa99-84645f675b62",
- "created_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2024-05-02T09:15:58.363635+00:00",
- "from_date": "2024-05-02T09:15:58.363635+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 11,
- "total_pages": 11,
- "limit": 1,
- "next_page": 2
}
}
Create a new cloud environment.
Creates a new cloud environment using the details provided in the request body. Returns the created cloud environment upon success.
Authorizations:
Request Body schema: application/jsonrequired
The cloud environment to be added.
| name required | string non-empty |
| description | string or null non-empty |
| cloud_provider_id required | integer <int32> |
| availability_zone_id required | integer <int64> |
| network_conf_id required | integer <int64> |
| cloud_project required | string non-empty |
| credential_config_id required | integer <int64> |
| boot_disk_image_id required | integer <int64> |
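As a hedged example, the sketch below shows how the create call might be issued from Python; the route, method, and authentication header are assumed, and the referenced IDs must correspond to existing provider, zone, network, credential, and boot disk image configurations in your tenant.

import requests

BASE_URL = "https://yeedu.example.com"          # assumed host
HEADERS = {"Authorization": "Bearer <token>"}   # assumed auth scheme

payload = {
    "name": "yeedu_cloud_env",
    "description": "yeedu cloud environment",
    "cloud_provider_id": 0,        # 0 = GCP in the sample data above
    "availability_zone_id": 75,
    "network_conf_id": 1,
    "cloud_project": "yeedu",
    "credential_config_id": 1,
    "boot_disk_image_id": 1,
}

# Hypothetical route; confirm the actual path for your deployment.
resp = requests.post(f"{BASE_URL}/api/v1/cloud_env", headers=HEADERS, json=payload, timeout=30)
resp.raise_for_status()
print("Created cloud_env_id:", resp.json()["cloud_env_id"])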
Responses
Request samples
- Payload
{- "name": "yeedu_cloud_env",
- "description": "yeedu cloud environment",
- "cloud_provider_id": 0,
- "availability_zone_id": 75,
- "network_conf_id": 1,
- "cloud_project": "yeedu",
- "credential_config_id": 1,
- "boot_disk_image_id": 1
}Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "cloud_env_id": 1,
- "name": "yeedu_cloud_env",
- "description": "yeedu cloud environment",
- "cloud_provider_id": "0",
- "availability_zone_id": "75",
- "network_conf_id": "1",
- "cloud_project": "yeedu",
- "credential_config_id": "1",
- "boot_disk_image_id": "1",
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-03-08T13:45:15.0882Z",
- "from_date": "2024-03-08T13:45:15.0882Z",
- "to_date": null
}Get details of a specific cloud environment.
Retrieves detailed information about a cloud environment, filtered by cloud environment ID or name.
Authorizations:
query Parameters
| cloud_env_id | integer <int64> Cloud environment ID used for filtering. |
| cloud_env_name | string Cloud environment name used for filtering. |
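A short Python sketch of fetching one cloud environment by name; the path, method, and auth header are assumptions, and either cloud_env_id or cloud_env_name may be passed as the filter.

import requests

BASE_URL = "https://yeedu.example.com"          # assumed host
HEADERS = {"Authorization": "Bearer <token>"}   # assumed auth scheme

# Hypothetical route; filter by ID or by name.
resp = requests.get(
    f"{BASE_URL}/api/v1/cloud_env",
    headers=HEADERS,
    params={"cloud_env_name": "azure_cloud_env"},
    timeout=30,
)
resp.raise_for_status()
env = resp.json()
print(env["cloud_env_id"], env["cloud_provider"]["name"])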
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "cloud_env_id": 21,
- "cloud_env_name": "azure_cloud_env",
- "description": "",
- "cloud_provider": {
- "cloud_provider_id": 2,
- "name": "Azure",
- "description": "Provider for creating infrastructure on Azure Cloud Platform"
}, - "availability_zone": {
- "availability_zone_id": 128,
- "name": "Central US",
- "region": "Central US",
- "description": "US, North America"
}, - "network_config": {
- "network_conf_id": 7,
- "name": "azure_test_network",
- "description": "creating network config in Azure cloud",
- "network_project_id": "yeedu-project",
- "network_name": "yeedu-network",
- "network_tags": [
- "yeedu",
- "iap-allow"
], - "subnet": "yeedu-subnet",
- "availability_zone": {
- "availability_zone_id": 129,
- "name": "East US",
- "region": "East US",
- "description": "US, North America"
}
}, - "cloud_project": "12345678",
- "credential_config": {
- "credential_conf_id": 8,
- "name": "azure_test_credential",
- "description": null,
- "credential_type": {
- "credential_type_id": 2,
- "name": "Azure Service Principal"
}
}, - "boot_disk_image": {
- "boot_disk_image_id": 4,
- "name": "azure_ubuntu",
- "description": "Base image for Azure Ubuntu",
- "linux_distro": {
- "linux_distro_id": 1,
- "distro_name": "UBUNTU",
- "distro_version": "22.04 LTS"
}, - "architecture_type": {
- "machine_architecture_type_id": 0,
- "machine_architecture_type": "x86_64"
}, - "boot_disk_image": "Canonical:0001-com-ubuntu-server-jammy:22_04-lts-gen2:latest"
}, - "tenant_id": "a6a9c5ea-57b6-4a1c-aa99-84645f675b62",
- "created_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2024-05-02T09:15:58.363635+00:00",
- "from_date": "2024-05-02T09:15:58.363635+00:00",
- "to_date": "infinity"
}Update details of a specific cloud environment.
Updates an existing cloud environment identified by ID or name with the provided details. Returns the updated cloud environment upon success.
Authorizations:
query Parameters
| cloud_env_id | integer <int64> Cloud environment ID used for modification. |
| cloud_env_name | string Cloud environment name used for modification. |
Request Body schema: application/jsonrequired
The cloud environment details to be modified.
| name | string non-empty The
|
| description | string or null non-empty |
| availability_zone_id | integer <int64> |
| network_conf_id | integer <int64> |
| cloud_project | string non-empty |
| credential_config_id | integer <int64> |
| boot_disk_image_id | integer <int64> |
Responses
Request samples
- Payload
{- "name": "gcp_cloud_env",
- "description": "yeedu cloud environment",
- "availability_zone_id": 75,
- "network_conf_id": 1,
- "cloud_project": "yeedu",
- "credential_config_id": 1,
- "boot_disk_image_id": 1
}Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "cloud_env_id": 1,
- "name": "yeedu_cloud_env",
- "description": "yeedu cloud environment",
- "cloud_provider_id": "0",
- "availability_zone_id": "75",
- "network_conf_id": "1",
- "cloud_project": "yeedu",
- "credential_config_id": "1",
- "boot_disk_image_id": "1",
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-03-08T13:45:15.0882Z",
- "from_date": "2024-03-08T13:45:15.0882Z",
- "to_date": null
}Delete a specific cloud environment.
Deletes a cloud environment identified by ID or name. Returns confirmation upon successful deletion.
Authorizations:
query Parameters
| cloud_env_id | integer <int64> Cloud environment ID used for deletion. |
| cloud_env_name | string Cloud environment name used for deletion. |
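A minimal deletion sketch in Python, again with an assumed route, method, and auth header.

import requests

BASE_URL = "https://yeedu.example.com"          # assumed host
HEADERS = {"Authorization": "Bearer <token>"}   # assumed auth scheme

# Hypothetical route; identify the environment by ID or by name.
resp = requests.delete(
    f"{BASE_URL}/api/v1/cloud_env",
    headers=HEADERS,
    params={"cloud_env_id": 1},
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["message"])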
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 500
{- "message": "Deleted cloud environment ID: 1."
}Get all the object storage manager configurations.
Returns a paginated list of object storage manager configurations, optionally filtered by cloud provider.
Authorizations:
query Parameters
| cloud_provider | string Enum: "GCP" "AWS" "Azure" Cloud provider used for filtering. |
| all | boolean Default: false Enum: true false A boolean that can be set to return all the data. |
| limit | integer <int32> Default: 100 The number of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
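For example, a Python sketch listing configurations for one provider; the endpoint path, method, and auth header are assumptions.

import requests

BASE_URL = "https://yeedu.example.com"          # assumed host
HEADERS = {"Authorization": "Bearer <token>"}   # assumed auth scheme

# Hypothetical route; confirm the actual path for your deployment.
resp = requests.get(
    f"{BASE_URL}/api/v1/object_storage_manager",
    headers=HEADERS,
    params={"cloud_provider": "GCP", "limit": 100, "pageNumber": 1},
    timeout=30,
)
resp.raise_for_status()
for osm in resp.json()["data"]:
    print(osm["object_storage_manager_id"], osm["object_storage_bucket_name"])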
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "object_storage_manager_id": 1,
- "object_storage_manager_name": "yeedu_osm",
- "description": "Yeedu Object Storage Manager",
- "cloud_provider": {
- "name": "GCP",
- "description": "Provider for creating infrastructure on Google Cloud Platform"
}, - "object_storage_bucket_name": "yeedu-dev",
- "credentials_config": {
- "name": "yeedu_credential_f24842b5-7e38-40f4-8a5d-98f75c269bd8",
- "credential_type": {
- "name": "Google Service Account"
}
}, - "tenant_id": "cf1f945f-01ce-4ac6-a070-8c733f2fa791",
- "created_by": {
- "user_id": 1,
- "username": "YSU0000"
}, - "modified_by": {
- "user_id": 1,
- "username": "YSU0000"
}, - "last_update_date": "2023-09-29T15:49:59.477968+00:00",
- "from_date": "2023-09-29T15:49:59.477968+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 1,
- "total_pages": 1,
- "limit": 100
}
}Search object storage manager configurations based on object storage manager name.
Searches for object storage manager configurations that match the provided name filter. Supports pagination and optional filtering by cloud provider.
Authorizations:
query Parameters
| object_storage_manager_name required | string Object storage manager name used for filtering |
| cloud_provider | string Enum: "GCP" "AWS" "Azure" Cloud provider used for filtering. |
| limit | integer <int32> Default: 100 The number of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "object_storage_manager_id": 1,
- "object_storage_manager_name": "yeedu_osm",
- "description": "Yeedu Object Storage Manager",
- "cloud_provider": {
- "name": "GCP",
- "description": "Provider for creating infrastructure on Google Cloud Platform"
}, - "object_storage_bucket_name": "yeedu-dev",
- "credentials_config": {
- "name": "yeedu_credential_f24842b5-7e38-40f4-8a5d-98f75c269bd8",
- "credential_type": {
- "name": "Google Service Account"
}
}, - "tenant_id": "cf1f945f-01ce-4ac6-a070-8c733f2fa791",
- "created_by": {
- "user_id": 1,
- "username": "YSU0000"
}, - "modified_by": {
- "user_id": 1,
- "username": "YSU0000"
}, - "last_update_date": "2023-09-29T15:49:59.477968+00:00",
- "from_date": "2023-09-29T15:49:59.477968+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 1,
- "total_pages": 1,
- "limit": 100
}
}Create a new object storage manager configuration
Creates a new object storage manager configuration using the details provided in the request body. Returns the created configuration upon success.
Authorizations:
Request Body schema: application/jsonrequired
The object storage manager configuration to be added.
| name required | string non-empty The
|
| description | string or null non-empty |
| cloud_provider_id required | integer <int32> |
| credentials_conf_id required | integer <int64> |
| object_storage_bucket_name required | string non-empty |
Responses
Request samples
- Payload
{- "name": "yeedu_osm",
- "description": "Yeedu Object Storage Manager",
- "cloud_provider_id": 0,
- "credentials_conf_id": 1,
- "object_storage_bucket_name": "yeedu"
}Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "object_storage_manager_id": "1",
- "name": "yeedu_osm",
- "description": "Yeedu Object Storage Manager",
- "cloud_provider_id": "0",
- "credentials_conf_id": "1",
- "object_storage_bucket_name": "yeedu",
- "tenant_id": "d9d98a22-5216-4955-b3d9-b0337d8ac0d9",
- "created_by_user_id": "3",
- "modified_by_user_id": "3",
- "last_update_date": "2023-03-17T09:31:42.221Z",
- "from_date": "2023-03-17T09:31:42.221Z",
- "to_date": null
}Get details of a specific object storage manager configuration.
Retrieves detailed information about a specific object storage manager configuration, filtered by configuration ID or name.
Authorizations:
query Parameters
| object_storage_manager_id | integer <int64> Object storage manager configuration ID used for filtering. |
| object_storage_manager_name | string Object storage manager configuration name used for filtering. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "object_storage_manager_id": 1,
- "object_storage_manager_name": "yeedu_osm",
- "description": "Yeedu Object Storage Manager",
- "cloud_provider": {
- "name": "GCP",
- "description": "Provider for creating infrastructure on Google Cloud Platform"
}, - "object_storage_bucket_name": "yeedu-dev",
- "credentials_config": {
- "name": "yeedu_credential_f24842b5-7e38-40f4-8a5d-98f75c269bd8",
- "credential_type": {
- "name": "GCP"
}
}, - "tenant_id": "cf1f945f-01ce-4ac6-a070-8c733f2fa791",
- "created_by": {
- "user_id": 1,
- "username": "YSU0000"
}, - "modified_by": {
- "user_id": 1,
- "username": "YSU0000"
}, - "last_update_date": "2023-09-29T15:49:59.477968+00:00",
- "from_date": "2023-09-29T15:49:59.477968+00:00",
- "to_date": "infinity"
}Update details of a specific object storage manager configuration.
Updates a previously created object storage manager configuration identified by ID or name, using the details provided in the request body. Returns the updated configuration on success.
Authorizations:
query Parameters
| object_storage_manager_id | integer <int64> Object storage manager configuration ID used for modification. |
| object_storage_manager_name | string Object storage manager configuration name used for modification. |
Request Body schema: application/jsonrequired
Object storage manager configuration details to be modified.
| name | string non-empty The
|
| description | string or null non-empty |
| credentials_conf_id | integer <int64> |
Responses
Request samples
- Payload
{- "name": "yeedu_osm",
- "description": "Yeedu Object Storage Manager",
- "credentials_conf_id": 1
}Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "object_storage_manager_id": "1",
- "name": "yeedu_osm",
- "description": "Yeedu Object Storage Manager",
- "cloud_provider_id": "0",
- "credentials_conf_id": "1",
- "object_storage_bucket_name": "yeedu",
- "tenant_id": "d9d98a22-5216-4955-b3d9-b0337d8ac0d9",
- "created_by_user_id": "3",
- "modified_by_user_id": "3",
- "last_update_date": "2023-03-17T09:31:42.221Z",
- "from_date": "2023-03-17T09:31:42.221Z",
- "to_date": null
}Delete a specific object storage manager configuration.
Deletes a previously created object storage manager configuration identified by ID or name. Returns confirmation upon successful deletion.
Authorizations:
query Parameters
| object_storage_manager_id | integer <int64> Object storage manager configuration ID used for deletion. |
| object_storage_manager_name | string Object storage manager configuration name used for deletion. |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 500
{- "message": "Deleted Object Storage Manager Configuration: 1"
}Get all the object storage manager files.
Retrieves a paginated list of files stored within a specific object storage manager configuration, filtered by configuration ID or name. Supports filtering by file ID, path, and a recursive flag to list files within directories.
Authorizations:
query Parameters
| object_storage_manager_id | integer <int64> Object storage manager configuration ID used for filtering. |
| object_storage_manager_name | string Object storage manager configuration name used for filtering. |
| file_id | integer <int64> Object storage manager file ID used for filtering. |
| file_path | string Object storage manager file path used for filtering. |
| recursive | boolean Default: false Enum: true false Boolean flag to indicate whether to list files recursively. |
| limit | integer <int32> Default: 100 The number of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "object_storage_manager_file_id": 68,
- "object_storage_manager": {
- "object_storage_manager_id": 1,
- "object_storage_manager_name": "aws_health_transformation_osm",
- "description": null,
- "cloud_provider": {
- "name": "AWS",
- "description": "Provider for creating infrastructure on Amazon Web Services"
}, - "credentials_config": {
- "credential_config_id": 1,
- "name": "aws_genomic_security",
- "description": null,
- "credential_type": {
- "credential_type_id": 1,
- "name": "AWS Access Secret Key Pair"
}
}, - "object_storage_bucket_name": "yeedu"
}, - "file_name": "hive-jdbc-1.0.jar",
- "full_file_path": "file:///yeedu/object-storage-manager/hive-jdbc-1.0.jar",
- "file_size_bytes": "18255759",
- "file_type": "jar",
- "tenant_id": "a6a9c5ea-57b6-4a1c-aa99-84645f675b62",
- "created_by": {
- "user_id": 6,
- "username": "ysu0000@yeedu.io"
}, - "modified_by": {
- "user_id": 6,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2024-04-25T15:17:04.683271+00:00",
- "from_date": "2024-04-25T15:17:04.683271+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 12,
- "total_pages": 12,
- "limit": 1,
- "next_page": 2
}
}Upload file to an object storage manager.
Uploads a file to the object storage manager configuration specified by ID or name.
- overwrite: A boolean parameter indicating whether the file should overwrite any existing file in the object storage manager. The default value is false.
- is_dir: A boolean parameter indicating whether to create a directory or upload a file.
- path: The path used to specify the file location when uploading a directory. It indicates the directory structure where the file will be stored.
- target_dir: The target directory, used as a prefix, for the destination when uploading a directory or file to the object storage manager.
Authorizations:
query Parameters
| object_storage_manager_id | integer <int64> Object storage manager configuration ID used for filtering. |
| object_storage_manager_name | string Object storage manager configuration name used for filtering. |
| overwrite | boolean Default: false Enum: true false Boolean flag to indicate whether to overwrite existing files. |
| is_dir | boolean Default: false Enum: true false A boolean flag indicating whether the uploaded file is part of a directory upload. |
| path required | string non-empty The path used to specify the file location when uploading a directory. |
| target_dir | string The target directory, used as a prefix, for the destination when uploading a directory or file. |
header Parameters
| x-file-size | number The header for file size |
Request Body schema: optional
File to upload to an object storage manager.
Metadata or control object.
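A hedged upload sketch in Python. The route, method, and auth header are assumptions, and the sketch assumes the file content is streamed as the raw request body with its size declared in the x-file-size header; consult your deployment for the exact upload mechanics.

import os
import requests

BASE_URL = "https://yeedu.example.com"          # assumed host
HEADERS = {"Authorization": "Bearer <token>"}   # assumed auth scheme

local_file = "spark-examples_2.12-3.2.2.jar"
file_size = os.path.getsize(local_file)

with open(local_file, "rb") as fh:
    # Hypothetical route; the raw-body upload style is an assumption.
    resp = requests.post(
        f"{BASE_URL}/api/v1/object_storage_manager/files",
        headers={**HEADERS, "x-file-size": str(file_size)},
        params={
            "object_storage_manager_id": 20,
            "path": local_file,
            "overwrite": "true",
        },
        data=fh,
        timeout=300,
    )
resp.raise_for_status()
print(resp.json()["full_file_path"])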
Responses
Request samples
- Payload
{ }Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "object_storage_manager_file_id": "76",
- "object_storage_manager_id": "20",
- "file_name": "spark-examples_2.12-3.2.2.jar",
- "full_file_path": "file:///yeedu/object-storage-manager/spark-examples_2.12-3.2.2.jar",
- "file_size_bytes": "1560870",
- "file_type": "jar",
- "tenant_id": "a6a9c5ea-57b6-4a1c-aa99-84645f675b62",
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-05-08T06:26:52.461Z",
- "from_date": "2024-05-08T06:26:52.461Z",
- "to_date": null
}Search object storage manager files based on file name.
Searches for files within a specific object storage manager configuration by file name. Supports filtering by configuration ID or name, file ID, file path, and recursive search.
Authorizations:
query Parameters
| file_name required | string Object storage manager file name used for filtering. |
| object_storage_manager_id | integer <int64> Object storage manager configuration ID used for filtering. |
| object_storage_manager_name | string Object storage manager configuration name used for filtering. |
| file_id | integer <int64> Object storage manager file ID used for filtering. |
| file_path | string Object storage manager file path used for filtering. |
| recursive | boolean Default: false Enum: true false Boolean flag to indicate whether to search files recursively. |
| limit | integer <int32> Default: 100 The number of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "object_storage_manager_file_id": 68,
- "object_storage_manager": {
- "object_storage_manager_id": 1,
- "object_storage_manager_name": "aws_health_transformation_osm",
- "description": null,
- "cloud_provider": {
- "name": "AWS",
- "description": "Provider for creating infrastructure on Amazon Web Services"
}, - "credentials_config": {
- "credential_config_id": 1,
- "name": "aws_genomic_security",
- "description": null,
- "credential_type": {
- "credential_type_id": 1,
- "name": "AWS Access Secret Key Pair"
}
}, - "object_storage_bucket_name": "yeedu"
}, - "file_name": "hive-jdbc-1.0.jar",
- "full_file_path": "file:///yeedu/object-storage-manager/hive-jdbc-1.0.jar",
- "file_size_bytes": "18255759",
- "file_type": "jar",
- "tenant_id": "a6a9c5ea-57b6-4a1c-aa99-84645f675b62",
- "created_by": {
- "user_id": 6,
- "username": "ysu0000@yeedu.io"
}, - "modified_by": {
- "user_id": 6,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2024-04-25T15:17:04.683271+00:00",
- "from_date": "2024-04-25T15:17:04.683271+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 12,
- "total_pages": 12,
- "limit": 1,
- "next_page": 2
}
}Get details of a specific object storage manager file.
Retrieves metadata and details of a specific file stored within an object storage manager configuration. Filters by configuration ID or name and file ID or path, allowing precise file queries.
Authorizations:
query Parameters
| object_storage_manager_id | integer <int64> Object storage manager configuration ID used for filtering. |
| object_storage_manager_name | string Object storage manager configuration name used for filtering. |
| file_id | integer <int64> Object storage manager file ID used for filtering. |
| file_path | string Object storage manager file path used for filtering. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "object_storage_manager_file_id": 68,
- "object_storage_manager": {
- "object_storage_manager_id": 1,
- "object_storage_manager_name": "aws_health_transformation_osm",
- "description": null,
- "cloud_provider": {
- "name": "AWS",
- "description": "Provider for creating infrastructure on Amazon Web Services"
}, - "credentials_config": {
- "credential_config_id": 1,
- "name": "aws_genomic_security",
- "description": null,
- "credential_type": {
- "credential_type_id": 1,
- "name": "AWS Access Secret Key Pair"
}
}, - "object_storage_bucket_name": "yeedu"
}, - "file_name": "hive-jdbc-1.0.jar",
- "full_file_path": "file:///yeedu/object-storage-manager/hive-jdbc-1.0.jar",
- "file_size_bytes": "18255759",
- "file_type": "jar",
- "tenant_id": "a6a9c5ea-57b6-4a1c-aa99-84645f675b62",
- "created_by": {
- "user_id": 6,
- "username": "ysu0000@yeedu.io"
}, - "modified_by": {
- "user_id": 6,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2024-04-25T15:17:04.683271+00:00",
- "from_date": "2024-04-25T15:17:04.683271+00:00",
- "to_date": "infinity"
}Delete a specific object storage manager file.
Deletes a specific file identified by configuration ID or name, and by file ID or path. Returns confirmation upon successful removal.
Authorizations:
query Parameters
| object_storage_manager_id | integer <int64> Object storage manager configuration ID used for filtering. |
| object_storage_manager_name | string Object storage manager configuration name used for filtering. |
| file_id | integer <int64> Object storage manager file ID used for deletion. |
| file_path | string Object storage manager file path used for deletion. |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 500
{- "message": "The File yeedu/object-storage-manager/8cee6100-7086-4138-92fd-712046174e91/spark-sql_2.12-3.2.0.jar has been deleted."
}Download object storage manager files or directories.
Downloads file content or entire directories stored in an object storage manager configuration. Filters allow specifying configuration ID or name, and file ID or path.
Authorizations:
query Parameters
| object_storage_manager_id | integer <int64> Object storage manager configuration ID used for filtering. |
| object_storage_manager_name | string Object storage manager configuration name used for filtering. |
| file_id | integer <int64> Object storage manager file ID used for filtering. |
| file_path | string Object storage manager file path used for filtering. |
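A streaming download sketch in Python (route, method, and auth header assumed); the response body is written to a local file in chunks.

import requests

BASE_URL = "https://yeedu.example.com"          # assumed host
HEADERS = {"Authorization": "Bearer <token>"}   # assumed auth scheme

# Hypothetical route; identify the file by file_id or file_path.
with requests.get(
    f"{BASE_URL}/api/v1/object_storage_manager/files/download",
    headers=HEADERS,
    params={"object_storage_manager_id": 20, "file_id": 76},
    stream=True,
    timeout=300,
) as resp:
    resp.raise_for_status()
    with open("downloaded_file", "wb") as out:
        for chunk in resp.iter_content(chunk_size=1 << 20):
            out.write(chunk)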
Responses
Response samples
- 400
- 401
- 404
- 500
{- "error_code": "string",
- "error_message": "string"
}Create a new Metastore Hive Catalog
Creates a new Metastore Catalog of type Hive with the provided catalog configuration details.
Rules:
- The user must have one of the following roles: PLATFORM ADMIN, ADMIN.
Authorizations:
Request Body schema: application/jsonrequired
The Metastore Hive Catalog to be added.
| name required | string non-empty The
|
| description | string or null non-empty |
| hiveSiteXml | string or null non-empty |
| coreSiteXml | string or null non-empty |
| hdfsSiteXml | string or null non-empty |
| krb5Conf | string or null non-empty |
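A hedged Python sketch of creating a Hive catalog, reading the XML and Kerberos configuration files from disk and sending their contents as strings; the route, method, auth header, and file names are assumptions.

from pathlib import Path
import requests

BASE_URL = "https://yeedu.example.com"          # assumed host
HEADERS = {"Authorization": "Bearer <token>"}   # assumed auth scheme

payload = {
    "name": "yeedu_hive_catalog",
    "description": "This is a test hive description",
    "hiveSiteXml": Path("hive-site.xml").read_text(),
    "coreSiteXml": Path("core-site.xml").read_text(),
    "hdfsSiteXml": Path("hdfs-site.xml").read_text(),
    "krb5Conf": Path("krb5.conf").read_text(),
}

# Hypothetical route; confirm the actual path for your deployment.
resp = requests.post(f"{BASE_URL}/api/v1/metastore/hive", headers=HEADERS, json=payload, timeout=30)
resp.raise_for_status()
print(resp.json()["metastore_catalog_id"])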
Responses
Request samples
- Payload
{- "name": "yeedu_hive_catalog",
- "description": "This is a test hive description",
- "hiveSiteXml": "string",
- "coreSiteXml": "string",
- "hdfsSiteXml": "string",
- "krb5Conf": "string"
}Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "metastore_catalog_id": "10",
- "name": "yeedu_hive_conf_updated_1",
- "description": "This is a test hive description",
- "catalog_type": "HIVE",
- "metastore_hive_catalog": {
- "metastore_hive_catalog_id": "4",
- "metastore_secrets_tenant_id": null,
- "object_storage_secrets_tenant_id": null
}, - "tenant_id": "234e28f0-c93f-4b27-b5b0-83c4e4b1dae2",
- "created_by_user_id": "3",
- "modified_by_user_id": "3",
- "last_update_date": "2025-04-07T16:13:16.503Z",
- "from_date": "2025-04-07T16:13:16.503Z",
- "to_date": null
}Update an existing Metastore Hive Catalog
Updates a specific Hive Metastore Catalog identified by 'metastore_catalog_id'.
Rules:
- The user must have one of the following roles: PLATFORM ADMIN, ADMIN.
- Any cluster attached to the provided metastore must be in DESTROYED or ERROR state.
Authorizations:
query Parameters
| metastore_catalog_id required | integer <int64> Metastore Hive Catalog ID used for modification. |
Request Body schema: application/jsonrequired
Metastore Hive Catalog details to be modified.
| name | string non-empty The
|
| description | string or null non-empty |
| hiveSiteXml | string or null non-empty |
| coreSiteXml | string or null non-empty |
| hdfsSiteXml | string or null non-empty |
| krb5Conf | string or null non-empty |
Responses
Request samples
- Payload
{- "name": "yeedu_hive_catalog",
- "description": "This is a test hive description",
- "hiveSiteXml": "string",
- "coreSiteXml": "string",
- "hdfsSiteXml": "string",
- "krb5Conf": "string"
}Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "metastore_catalog_id": "10",
- "name": "yeedu_hive_conf_updated_1",
- "description": "This is a test hive description",
- "catalog_type": "HIVE",
- "metastore_hive_catalog": {
- "metastore_hive_catalog_id": "4",
- "metastore_secrets_tenant_id": null,
- "object_storage_secrets_tenant_id": null
}, - "tenant_id": "234e28f0-c93f-4b27-b5b0-83c4e4b1dae2",
- "created_by_user_id": "3",
- "modified_by_user_id": "3",
- "last_update_date": "2025-04-07T16:13:16.503Z",
- "from_date": "2025-04-07T16:13:16.503Z",
- "to_date": null
}Delete a Metastore Hive Catalog
Deletes the specified Hive Metastore Catalog by its ID.
Rules:
- The user must have one of the following roles: PLATFORM ADMIN, ADMIN.
- Any cluster attached to the provided metastore must be in DESTROYED or ERROR state.
Authorizations:
query Parameters
| metastore_catalog_id required | integer <int64> Metastore Hive Catalog ID used for deletion. |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 500
{- "message": "Deleted Hive Metastore Configuration Id: 1"
}Create a new Glue Catalog
Creates a new Metastore Glue Catalog of type 'AWS GLUE' with the provided catalog configuration details.
Rules:
- The user must have one of the following roles: PLATFORM ADMIN, ADMIN.
Authorizations:
Request Body schema: application/jsonrequired
The Metastore Glue Catalog to be added.
| name required | string non-empty The
|
| description | string or null non-empty |
Responses
Request samples
- Payload
{- "name": "yeedu_glue_catalog",
- "description": "This is a test glue description"
}Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "metastore_catalog_id": "10",
- "name": "yeedu_glue_conf_updated_1",
- "description": "This is a test glue description",
- "catalog_type": "AWS GLUE",
- "metastore_glue_catalog": {
- "metastore_glue_catalog_id": "4"
}, - "tenant_id": "234e28f0-c93f-4b27-b5b0-83c4e4b1dae2",
- "created_by_user_id": "3",
- "modified_by_user_id": "3",
- "last_update_date": "2025-04-07T16:13:16.503Z",
- "from_date": "2025-04-07T16:13:16.503Z",
- "to_date": null
}Update a Metastore Glue Catalog
Update a specific Metastore Glue Catalog of type 'AWS GLUE'.
Rules:
- The user must have one of the following roles: PLATFORM ADMIN, ADMIN.
- Any cluster attached to the provided metastore must be in DESTROYED or ERROR state.
Authorizations:
query Parameters
| metastore_catalog_id required | integer <int64> Metastore Glue Catalog ID used for modification. |
Request Body schema: application/jsonrequired
Metastore Glue Catalog details to be modified.
| name | string non-empty The
|
| description | string or null non-empty |
Responses
Request samples
- Payload
{- "name": "yeedu_Glue_catalog",
- "description": "This is a test Glue description"
}Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "metastore_catalog_id": "10",
- "name": "yeedu_glue_conf_updated_1",
- "description": "This is a test glue description",
- "catalog_type": "AWS GLUE",
- "metastore_glue_catalog": {
- "metastore_glue_catalog_id": "4"
}, - "tenant_id": "234e28f0-c93f-4b27-b5b0-83c4e4b1dae2",
- "created_by_user_id": "3",
- "modified_by_user_id": "3",
- "last_update_date": "2025-04-07T16:13:16.503Z",
- "from_date": "2025-04-07T16:13:16.503Z",
- "to_date": null
}Delete a Metastore Glue Catalog.
Deletes a specific Metastore Catalog of type 'AWS GLUE'.
Rules:
- The user must have one of the following roles: PLATFORM ADMIN, ADMIN.
- Any cluster attached to the provided metastore must be in DESTROYED or ERROR state.
Authorizations:
query Parameters
| metastore_catalog_id required | integer <int64> Metastore Glue Catalog ID used for deletion. |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 500
{- "message": "Deleted Glue Metastore Configuration Id: 1"
}Retrieve Metastore Catalogs
Retrieves a paginated list of Metastore catalogs. Supports optional filtering by catalog ID and catalog type.
Authorizations:
query Parameters
| metastore_catalog_id | integer <int64> Example: metastore_catalog_id=1 Filter by metastore catalog ID. |
| limit | integer <int32> Default: 100 The number of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
| catalog_type | string Enum: "HIVE" "DATABRICKS UNITY" "AWS GLUE" Type of catalog to filter the metastore catalogs. |
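A Python sketch that pages through the catalogs using the next_page field of result_set; the route, method, and auth header are assumptions.

import requests

BASE_URL = "https://yeedu.example.com"          # assumed host
HEADERS = {"Authorization": "Bearer <token>"}   # assumed auth scheme

page = 1
while True:
    # Hypothetical route; optionally filter by catalog_type.
    resp = requests.get(
        f"{BASE_URL}/api/v1/metastore/catalogs",
        headers=HEADERS,
        params={"catalog_type": "HIVE", "limit": 100, "pageNumber": page},
        timeout=30,
    )
    resp.raise_for_status()
    body = resp.json()
    for catalog in body["data"]:
        print(catalog["metastore_catalog_id"], catalog["name"])
    next_page = body["result_set"].get("next_page")
    if not next_page:
        break
    page = next_page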
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "metastore_catalog_id": 1,
- "name": "databricks_unity_metastore",
- "description": "This metastore refers to the metastore of databricks unity",
- "catalog_type": "DATABRICKS UNITY",
- "metastore_details": {
- "metastore_unity_catalog_id": "1,",
- "default_catalog": "dev,",
- "storage_path": "s3://yeedu-bucket/dev"
}, - "tenant_id": "a6a9c5ea-57b6-4a1c-aa99-84645f675b62",
- "created_by": {
- "user_id": 4,
- "username": "ya0000@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2025-03-19T04:47:23.824+00:00",
- "from_date": "2025-03-19T04:47:23.824+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 2,
- "total_pages": 2,
- "limit": 1,
- "next_page": 2
}
}Search Metastore Catalogs by name
Searches Metastore catalogs by name with pagination and filtering support.
Authorizations:
query Parameters
| metastore_catalog_name required | string Example: metastore_catalog_name=unity Search by metastore catalog name. |
| limit | integer <int32> Default: 100 The number of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
| catalog_type | string Enum: "HIVE" "DATABRICKS UNITY" "AWS GLUE" Type of catalog to filter the metastore catalogs. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "metastore_catalog_id": 1,
- "name": "databricks_unity_metastore",
- "description": "This metastore refers to the metastore of databricks unity",
- "catalog_type": "DATABRICKS UNITY",
- "metastore_details": {
- "metastore_unity_catalog_id": "1,",
- "default_catalog": "dev,",
- "storage_path": "s3://yeedu-bucket/dev"
}, - "tenant_id": "a6a9c5ea-57b6-4a1c-aa99-84645f675b62",
- "created_by": {
- "user_id": 4,
- "username": "ya0000@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2025-03-19T04:47:23.824+00:00",
- "from_date": "2025-03-19T04:47:23.824+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 2,
- "total_pages": 2,
- "limit": 1,
- "next_page": 2
}
}Create a Metastore Unity Catalog
Creates a new Metastore catalog of type 'DATABRICKS UNITY'.
Rules:
- The user must have one of the following roles: PLATFORM ADMIN, ADMIN.
Authorizations:
Request Body schema: application/jsonrequired
The Metastore Unity Catalog configuration to be added.
| name required | string Name of the unity metastore catalog.
|
| description | string Description of the unity metastore catalog (optional).
|
| endpoint required | string Endpoint for the unity metastore. |
| default_catalog required | string Default catalog for the unity metastore. |
| storage_path required | string Cloud storage path where the data is stored. |
Responses
Request samples
- Payload
{- "name": "databricks_unity_metastore",
- "description": "This metastore refers to the metastore of databricks unity",
- "default_catalog": "dev",
- "storage_path": "s3://yeedu-bucket/dev"
}Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "metastore_catalog_id": 1,
- "name": "databricks_unity_metastore",
- "description": "This metastore refers to the metastore of databricks unity",
- "catalog_type": "DATABRICKS UNITY",
- "metastore_details": {
- "metastore_unity_catalog_id": 1,
- "default_catalog": "dev",
- "storage_path": "s3://yeedu-bucket/dev",
- "metastore_secrets_tenant_id": 1,
- "object_storage_secrets_tenant_id": 2
}, - "tenant_id": "3c5f2335-8a9c-46c8-bd1e-97420686e216",
- "created_by_user_id": 1,
- "modified_by_user_id": 1,
- "last_update_date": "2025-03-19T04:47:23.824Z",
- "from_date": "2025-03-19T04:47:23.824Z",
- "to_date": "infinity"
}Update a Metastore Unity Catalog
Update a specific Metastore Unity Catalog of type 'DATABRICKS UNITY' by 'metastore_catalog_id'.
Rules:
- The user must have one of the following roles: PLATFORM ADMIN, ADMIN.
- Any cluster attached to the provided metastore must be in DESTROYED or ERROR state.
Authorizations:
query Parameters
| metastore_catalog_id required | integer <int64> Example: metastore_catalog_id=1 The ID of the Metastore Unity Catalog to be updated. |
Request Body schema: application/jsonrequired
The Metastore Unity Catalog configuration to be updated.
| name | string Name of the unity metastore catalog.
|
| description | string or null non-empty Description of the unity metastore catalog (optional).
|
| endpoint | string Endpoint for the unity metastore. |
| default_catalog | string Default catalog for the unity metastore. |
| storage_path | string Cloud storage path where the data is stored. |
Responses
Request samples
- Payload
{- "name": "databricks_unity_metastore",
- "description": "This metastore refers to the metastore of databricks unity",
- "default_catalog": "dev",
- "storage_path": "s3://yeedu-bucket/dev"
}Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "metastore_catalog_id": 1,
- "name": "databricks_unity_metastore",
- "description": "This metastore refers to the metastore of databricks unity",
- "catalog_type": "DATABRICKS UNITY",
- "metastore_details": {
- "metastore_unity_catalog_id": 1,
- "default_catalog": "dev",
- "storage_path": "s3://yeedu-bucket/dev",
- "metastore_secrets_tenant_id": 1,
- "object_storage_secrets_tenant_id": 2
}, - "tenant_id": "3c5f2335-8a9c-46c8-bd1e-97420686e216",
- "created_by_user_id": 1,
- "modified_by_user_id": 1,
- "last_update_date": "2025-03-19T04:47:23.824Z",
- "from_date": "2025-03-19T04:47:23.824Z",
- "to_date": "infinity"
}Delete a Metastore Unity Catalog
Deletes a specific Metastore Unity Catalog of type 'DATABRICKS UNITY' by ID.
Rules:
- The user must have one of the following roles: PLATFORM ADMIN, ADMIN.
- Any cluster attached to the provided metastore must be in DESTROYED or ERROR state.
Authorizations:
query Parameters
| metastore_catalog_id required | integer <int64> Example: metastore_catalog_id=1 The ID of the Metastore Unity Catalog to be deleted. |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "message": "Metastore Unity Catalog 1 deleted successfully."
}Retrieve Metastore Catalogs and their associated secret tenants
Retrieves a paginated list of metastore catalogs along with their associated secret tenants.
Supports filtering by secret tenant ID, catalog type (DATABRICKS UNITY, HIVE, AWS GLUE), metastore catalog ID, and name.
Authorizations:
query Parameters
| metastore_catalog_secret_tenant_id | integer <int64> Example: metastore_catalog_secret_tenant_id=1 The ID of the Metastore Catalog Secret Tenant to filter by. |
| catalog_type | string Enum: "DATABRICKS UNITY" "HIVE" "AWS GLUE" Filter by catalog type. Only DATABRICKS UNITY and HIVE are supported. |
| metastore_catalog_id | integer <int64> Example: metastore_catalog_id=1 The ID of the Metastore Catalog to filter by. |
| metastore_catalog_name | string Example: metastore_catalog_name=unity The name of the Metastore Catalog to filter by. |
| limit | integer <int32> Default: 100 The number of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "metastore_catalog_tenant_secret_id": 1,
- "name": "databricks_unity_metastore",
- "description": "This metastore refers to the metastore of databricks unity",
- "catalog_type": "DATABRICKS UNITY",
- "metastore_secrets_tenant": {
- "secrets_tenant_id": "1,",
- "name": "databricks_token_secret",
- "description": "This is a secret refereing to databricks unity token",
- "secret_type": "DATABRICKS UNITY TOKEN"
}, - "object_storage_secrets_tenant": {
- "secrets_tenant_id": "2,",
- "name": "databricks_cloud_storage_secret",
- "description": "This is a secret refereing to cloud storage of AWS",
- "secret_type": null
}, - "tenant_id": "a6a9c5ea-57b6-4a1c-aa99-84645f675b62",
- "created_by": {
- "user_id": 4,
- "username": "ya0000@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2025-03-19T04:47:23.824+00:00",
- "from_date": "2025-03-19T04:47:23.824+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 2,
- "total_pages": 2,
- "limit": 1,
- "next_page": 2
}
}Attach a Metastore Catalog to a secret tenant
Associates a Metastore Catalog with a secret tenant.
Rules:
- The user must have one of the following roles: PLATFORM ADMIN, ADMIN, or CAN MANAGE CLUSTER to attach a metastore to secret tenants.
- metastore_secrets_tenant_id is only supported for catalog types DATABRICKS UNITY and HIVE. It is not supported for AWS_GLUE.
Authorizations:
Request Body schema: application/jsonrequired
The Metastore Unity Catalog configuration to be added.
| metastore_catalog_id required | integer <int64> |
| metastore_secrets_tenant_id required | integer <int64> Secret for the Metastore Catalog.
|
| object_storage_secrets_tenant_id | integer <int64> Secret for the cloud storage.
|
Responses
Request samples
- Payload
{- "metastore_catalog_id": 0,
- "metastore_secrets_tenant_id": 0,
- "object_storage_secrets_tenant_id": 0
}Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "metastore_catalog_secret_tenant_id": 1,
- "metastore_catalog_id": 1,
- "metastore_secrets_tenant_id": 1,
- "object_storage_secrets_tenant_id": 2,
- "created_by_user_id": 1,
- "modified_by_user_id": 1,
- "last_update_date": "2025-03-19T04:47:23.824Z",
- "from_date": "2025-03-19T04:47:23.824Z",
- "to_date": "infinity"
}Update a Metastore Catalog Secret Tenant association
Updates the secret tenant details linked to a Metastore Catalog.
Rules:
- The user must have one of the following roles: PLATFORM ADMIN, ADMIN.
- metastore_secrets_tenant_id is only supported for catalog types DATABRICKS UNITY and HIVE. It is not supported for AWS_GLUE.
Authorizations:
Request Body schema: application/jsonrequired
The Metastore Catalog secret to be updated.
| metastore_catalog_secret_tenant_id required | integer <int64> |
| metastore_secrets_tenant_id | integer or null <int64> Secret for the Metastore Catalog.
|
| object_storage_secrets_tenant_id | integer or null <int64> Secret for the cloud storage.
|
Responses
Request samples
- Payload
{- "metastore_catalog_secret_tenant_id": 0,
- "metastore_secrets_tenant_id": 0,
- "object_storage_secrets_tenant_id": 0
}Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "metastore_catalog_id": 1,
- "name": "databricks_unity_metastore",
- "description": "This metastore refers to the metastore of databricks unity",
- "catalog_type": "DATABRICKS UNITY",
- "metastore_details": {
- "metastore_unity_catalog_id": 1,
- "default_catalog": "dev",
- "storage_path": "s3://yeedu-bucket/dev",
- "metastore_secrets_tenant_id": 1,
- "object_storage_secrets_tenant_id": 2
}, - "tenant_id": "3c5f2335-8a9c-46c8-bd1e-97420686e216",
- "created_by_user_id": 1,
- "modified_by_user_id": 1,
- "last_update_date": "2025-03-19T04:47:23.824Z",
- "from_date": "2025-03-19T04:47:23.824Z",
- "to_date": "infinity"
}Delete the association of Metastore Catalog Secret Tenant
Deletes the secret tenant association from a Metastore Catalog.
Rules:
- The user must have one of the following roles: PLATFORM ADMIN, ADMIN.
Authorizations:
query Parameters
| metastore_catalog_secret_tenant_id required | integer <int64> Example: metastore_catalog_secret_tenant_id=1 The ID of the Metastore Catalog Secret Tenant to be deleted. |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "message": "Metastore Catalog Secret tenant association deleted successfully."
}Retrieve Metastore Catalogs and their associated secret workspaces
Retrieves a paginated list of metastore catalogs along with their linked secret workspaces.
Supports filtering by secret workspace ID, workspace ID, catalog type (DATABRICKS UNITY, HIVE, AWS GLUE), metastore catalog ID, and name.
Authorizations:
query Parameters
| metastore_catalog_secret_workspace_id | integer <int64> Example: metastore_catalog_secret_workspace_id=1 The ID of the Metastore Catalog Secret Workspace to filter by. |
| workspace_id | integer <int64> Example: workspace_id=1 The ID of the Workspace to filter by. |
| catalog_type | string Enum: "DATABRICKS UNITY" "HIVE" "AWS GLUE" Filter by catalog type. Only DATABRICKS UNITY, HIVE, and AWS GLUE are supported. |
| metastore_catalog_id | integer <int64> Example: metastore_catalog_id=1 The ID of the metastore catalog to filter by. |
| metastore_catalog_name | string Example: metastore_catalog_name=unity The name of the metastore catalog to filter by. |
| limit | integer <int32> Default: 100 The number of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "metastore_catalog_id": 1,
- "name": "databricks_unity_metastore",
- "description": "This metastore refers to the metastore of databricks unity",
- "catalog_type": "DATABRICKS UNITY",
- "metastore_secrets_workspace": {
- "secrets_workspace_id": "1,",
- "name": "databricks_token_secret",
- "description": "This is a secret refereing to databricks unity token",
- "secret_type": "DATABRICKS UNITY TOKEN"
}, - "object_storage_secrets_workspace": {
- "secrets_workspace_id": "2,",
- "name": "databricks_cloud_storage_secret",
- "description": "This is a secret refereing to cloud storage of AWS",
- "secret_type": "AWS ACCESS SECRET KEY PAIR"
}, - "tenant_id": "a6a9c5ea-57b6-4a1c-aa99-84645f675b62",
- "created_by": {
- "user_id": 4,
- "username": "ya0000@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2025-03-19T04:47:23.824+00:00",
- "from_date": "2025-03-19T04:47:23.824+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 2,
- "total_pages": 2,
- "limit": 1,
- "next_page": 2
}
}Attach a Metastore Catalog to a secret workspace
Associates a Metastore Catalog with a secret workspace.
Rules:
- The user must have one of the following roles: PLATFORM ADMIN, ADMIN to attach a metastore to a secret workspace.
- The user must have MANAGE permission on the workspace to attach the secret to the metastore.
- metastore_secrets_workspace_id is only supported for catalog types DATABRICKS UNITY and HIVE. It is not supported for AWS_GLUE.
Authorizations:
Request Body schema: application/jsonrequired
The Metastore Catalog secret workspace to be attached.
| metastore_catalog_id required | integer <int64> |
| metastore_secrets_workspace_id required | integer <int64> Secret for the Metastore Catalog.
|
| object_storage_secrets_workspace_id | integer <int64> Secret for the cloud storage.
|
Responses
Request samples
- Payload
{- "metastore_catalog_id": 0,
- "metastore_secrets_workspace_id": 0,
- "object_storage_secrets_workspace_id": 0
}Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "metastore_catalog_secret_workspace_id": 1,
- "metastore_catalog_id": 1,
- "metastore_secrets_workspace_id": 1,
- "object_storage_secrets_workspace_id": 2,
- "created_by_user_id": 1,
- "modified_by_user_id": 1,
- "last_update_date": "2025-03-19T04:47:23.824Z",
- "from_date": "2025-03-19T04:47:23.824Z",
- "to_date": "infinity"
}Update a Metastore Catalog Secret Workspace association
Updates secret workspace association details for a Metastore Catalog.
Rules:
- The user must have one of the following roles: PLATFORM ADMIN, ADMIN.
- The user must have MANAGE permission on the workspace to edit the association of the workspace secret with the metastore.
- metastore_secrets_workspace_id is only supported for catalog types DATABRICKS UNITY and HIVE. It is not supported for AWS_GLUE.
Authorizations:
Request Body schema: application/jsonrequired
The Metastore Catalog secret workspace to be updated.
| metastore_catalog_secret_workspace_id required | integer <int64> |
| metastore_secrets_workspace_id | integer or null <int64> Secret for the Metastore Catalog.
|
| object_storage_secrets_workspace_id | integer or null <int64> Secret for the cloud storage.
|
Responses
Request samples
- Payload
{- "metastore_catalog_secret_workspace_id": 0,
- "metastore_secrets_workspace_id": 0,
- "object_storage_secrets_workspace_id": 0
}Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "metastore_catalog_secret_workspace_id": 1,
- "metastore_catalog_id": 1,
- "metastore_secrets_workspace_id": 1,
- "object_storage_secrets_workspace_id": 2,
- "created_by_user_id": 1,
- "modified_by_user_id": 1,
- "last_update_date": "2025-03-19T04:47:23.824Z",
- "from_date": "2025-03-19T04:47:23.824Z",
- "to_date": "infinity"
}Delete the association of Metastore Catalog secret workspace
Deletes a secret workspace association from a Metastore Catalog.
Rules:
- The user must have one of the following roles: PLATFORM ADMIN, ADMIN.
- The user must have MANAGE permission on the workspace to remove the association of the workspace secret from the metastore.
Authorizations:
query Parameters
| metastore_catalog_secret_workspace_id required | integer <int64> Example: metastore_catalog_secret_workspace_id=1 The ID of the Metastore Catalog Secret Workspace to be deleted. |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "message": "Metastore Catalog Secret workspace association deleted successfully."
}Retrieve Metastore Catalogs and their associated secret users
Retrieves a paginated list of metastore catalogs along with their associated secret users. Supports filtering by secret user ID, user ID, catalog type, metastore catalog ID, and name.
Authorizations:
query Parameters
| metastore_catalog_secret_user_id | integer <int64> Example: metastore_catalog_secret_user_id=1 The ID of the metastore catalog secret user to be retrieved. |
| user_id | integer <int64> Example: user_id=1 The ID of the user to be retrieved. |
| catalog_type | string Enum: "DATABRICKS UNITY" "HIVE" "AWS GLUE" Filter by catalog type. Only DATABRICKS UNITY and HIVE are supported. |
| metastore_catalog_id | integer <int64> Example: metastore_catalog_id=1 The ID of the metastore catalog to be retrieved. |
| metastore_catalog_name | string Example: metastore_catalog_name=unity The name of the metastore catalog to be retrieved. |
| limit | integer <int32> Default: 100 The number of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "metastore_catalog_user_secret_id": 1,
- "name": "databricks_unity_metastore",
- "description": "This metastore refers to the metastore of databricks unity",
- "catalog_type": "DATABRICKS UNITY",
- "metastore_secrets_user": {
- "secrets_user_id": "1,",
- "name": "databricks_token_secret",
- "description": "This is a secret refereing to databricks unity token",
- "secret_type": "DATABRICKS UNITY TOKEN"
}, - "object_storage_secrets_user": {
- "secrets_user_id": "2,",
- "name": "databricks_cloud_storage_secret",
- "description": "This is a secret refereing to cloud storage of AWS",
- "secret_type": "AWS ACCESS SECRET KEY PAIR"
}, - "tenant_id": "a6a9c5ea-57b6-4a1c-aa99-84645f675b62",
- "created_by": {
- "user_id": 4,
- "username": "ya0000@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2025-03-19T04:47:23.824+00:00",
- "from_date": "2025-03-19T04:47:23.824+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 2,
- "total_pages": 2,
- "limit": 1,
- "next_page": 2
}
}Attach a Metastore Catalog to a secret user
Associates a Metastore Catalog with a secret user.
Rules:
- Only the user who created the secret has permission to attach a metastore to a secret user.
- metastore_secrets_user_id is only supported for catalog types DATABRICKS UNITY and HIVE. It is not supported for AWS_GLUE.
Authorizations:
Request Body schema: application/jsonrequired
The Metastore Catalog secret user to be attached.
| metastore_catalog_id required | integer <int64> |
| metastore_secrets_user_id required | integer <int64> Secret for the Metastore Catalog.
|
| object_storage_secrets_user_id | integer <int64> Secret for the cloud storage.
|
Responses
Request samples
- Payload
{- "metastore_catalog_id": 0,
- "metastore_secrets_user_id": 0,
- "object_storage_secrets_user_id": 0
}Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "metastore_catalog_secret_user_id": 1,
- "metastore_catalog_id": 1,
- "metastore_secrets_user_id": 1,
- "object_storage_secrets_user_id": 2,
- "created_by_user_id": 1,
- "modified_by_user_id": 1,
- "last_update_date": "2025-03-19T04:47:23.824Z",
- "from_date": "2025-03-19T04:47:23.824Z",
- "to_date": "infinity"
}Update a Metastore Catalog Secret User association
Updates the secret user association details for a Metastore Catalog.
Rules:
- Only the user who attached the secret user to the metastore catalog can update the secret for the metastore.
- metastore_secrets_user_id is only supported for catalog types DATABRICKS UNITY and HIVE. It is not supported for AWS_GLUE.
Authorizations:
Request Body schema: application/jsonrequired
The Metastore Catalog secret user to be updated.
| metastore_catalog_secret_user_id required | integer <int64> |
| metastore_secrets_user_id | integer or null <int64> Secret for the Metastore Catalog.
|
| object_storage_secrets_user_id | integer or null <int64> Secret for the cloud storage.
|
Responses
Request samples
- Payload
{- "metastore_catalog_secret_user_id": 0,
- "metastore_secrets_user_id": 0,
- "object_storage_secrets_user_id": 0
}Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "metastore_catalog_secret_user_id": 1,
- "metastore_catalog_id": 1,
- "metastore_secrets_user_id": 1,
- "object_storage_secrets_user_id": 2,
- "created_by_user_id": 1,
- "modified_by_user_id": 1,
- "last_update_date": "2025-03-19T04:47:23.824Z",
- "from_date": "2025-03-19T04:47:23.824Z",
- "to_date": "infinity"
}Delete the association of Metastore Catalog secret user
Deletes a secret user association from a Metastore Catalog.
Rules:
- The user must have one of the following roles: PLATFORM ADMIN, ADMIN; in addition, the user who attached the secret to the metastore can delete the association of the secret user with the metastore.
Authorizations:
query Parameters
| metastore_catalog_secret_user_id | integer <int64> Example: metastore_catalog_secret_user_id=1 The ID of the Metastore Catalog secret user to be deleted. |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "message": "Metastore Catalog Secret user association deleted successfully."
}Get list of catalogs
Retrieves a list of catalogs available under the specified metastore catalog for the given workspace.
Authorizations:
path Parameters
| metastore_catalog_id required | integer <int64> The ID of the Metastore Catalog to retrieve catalogs from. |
query Parameters
| workspace_id required | integer <int64> |
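A Python sketch of listing catalogs for a workspace; the route shape (metastore catalog ID as a path parameter) follows this section, while the base URL, exact path, method, and auth header are assumptions.

import requests

BASE_URL = "https://yeedu.example.com"          # assumed host
HEADERS = {"Authorization": "Bearer <token>"}   # assumed auth scheme

metastore_catalog_id = 1

# Hypothetical route built around the metastore_catalog_id path parameter.
resp = requests.get(
    f"{BASE_URL}/api/v1/metastore/{metastore_catalog_id}/catalogs",
    headers=HEADERS,
    params={"workspace_id": 1},
    timeout=30,
)
resp.raise_for_status()
for catalog in resp.json():
    print(catalog["full_name"], catalog["owner"])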
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "name": "catalog_name",
- "owner": "user@example.com",
- "storage_root": "s3://example-bucket/path/to/root",
- "storage_location": "s3://example-bucket/path/to/catalog/location",
- "full_name": "catalog_name",
- "created_at": 1746006869220,
- "updated_at": 1746006869220,
- "updated_by": "user@example.com",
- "type": "CATALOG"
}
]Get schemas for catalog
Retrieves a list of schemas available under the specified metastore catalog for the given workspace.
Authorizations:
path Parameters
| metastore_catalog_id required | string The ID of the Metastore Catalog to retrieve schemas from. |
query Parameters
| workspace_id required | string The ID of the workspace to retrieve schemas for. |
| catalog_name | string The name of the catalog to retrieve schemas from. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "name": "default",
- "catalog_name": "bronze",
- "owner": "user@example.com",
- "comment": "Default schema (auto-created)",
- "full_name": "bronze.default",
- "created_at": 1742996088010,
- "created_by": "user@example.com",
- "updated_at": 1742996088010,
- "updated_by": "user@example.com",
- "type": "SCHEMA"
}
]List tables in schema
Retrieves a list of tables available under the specified metastore catalog and schema for the given workspace.
Authorizations:
path Parameters
| metastore_catalog_id required | string The ID of the Metastore Catalog to retrieve tables from. |
query Parameters
| workspace_id required | string The ID of the workspace to retrieve tables for. |
| catalog_name | string The name of the catalog to retrieve tables from. |
| schema_name required | string The name of the schema to retrieve tables from. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "name": "tables",
- "catalog_name": "bronze",
- "schema_name": "information_schema",
- "owner": "System user",
- "full_name": "bronze.information_schema.tables",
- "created_at": 1742996088198,
- "created_by": "System user",
- "updated_at": 1742996088198,
- "updated_by": "System user",
- "type": "TABLE"
}
]Get list of table full names
Retrieves a list of table summaries available under the specified metastore catalog for the given workspace.
Authorizations:
path Parameters
| metastore_catalog_id required | string The ID of the Metastore Catalog to retrieve table summaries from. |
query Parameters
| workspace_id required | string The ID of the workspace to retrieve table summaries for. |
| catalog_name | string The name of the catalog to retrieve table summaries from. |
| cached_tables | Array of strings A list of table names to filter the results. If provided, only these tables will be included in the response. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- "bronze.information_schema.tables"
]DDL of a table
Retrieves the DDL (Data Definition Language) for a list of tables in the specified metastore catalog for a given workspace.
Authorizations:
path Parameters
| metastore_catalog_id required | string The ID of the Metastore Catalog to retrieve table DDLs from. |
query Parameters
| workspace_id required | string The ID of the workspace to retrieve table DDLs for. |
Request Body schema: application/jsonrequired
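A hedged Python sketch of requesting DDL for a set of tables. The request body is a JSON array of strings; the sketch assumes these are fully qualified table names, and the route, method, and auth header are likewise assumptions.

import requests

BASE_URL = "https://yeedu.example.com"          # assumed host
HEADERS = {"Authorization": "Bearer <token>"}   # assumed auth scheme

metastore_catalog_id = 1
tables = ["bronze.information_schema.tables"]   # assumed to be fully qualified table names

# Hypothetical route; the workspace is passed as a query parameter.
resp = requests.post(
    f"{BASE_URL}/api/v1/metastore/{metastore_catalog_id}/tables/ddl",
    headers=HEADERS,
    params={"workspace_id": 1},
    json=tables,
    timeout=60,
)
resp.raise_for_status()
for entry in resp.json():
    print(entry)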
Responses
Request samples
- Payload
[- "string"
]Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "property1": "string",
- "property2": "string"
}
]Get list of columns for a table
Retrieves a list of columns available in a specific table under the specified metastore catalog and schema for the given workspace.
Authorizations:
path Parameters
| metastore_catalog_id required | string The ID of the Metastore Catalog to retrieve columns from. |
query Parameters
| workspace_id required | string The ID of the workspace to retrieve columns for. |
| catalog_name | string The name of the catalog to retrieve columns from. |
| schema_name required | string The name of the schema to retrieve columns from. |
| table_name required | string The name of the table to retrieve columns from. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "name": "grantor",
- "type_text": "string",
- "type_name": "STRING",
- "position": 0,
- "type": "INT"
}
]List functions in schema
Retrieves a list of functions available under the specified metastore catalog and schema for the given workspace.
Authorizations:
path Parameters
| metastore_catalog_id required | string The ID of the Metastore Catalog to retrieve functions from. |
query Parameters
| workspace_id required | string The ID of the workspace to retrieve functions for. |
| catalog_name | string The name of the catalog to retrieve functions from. |
| schema_name | string The name of the schema to retrieve functions from. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "name": "my_function",
- "catalog_name": "sample_catalog",
- "schema_name": "default_schema",
- "data_type": "STRING",
- "full_data_type": "STRING",
- "routine_body": "SQL",
- "routine_definition": "CONCAT(first_name, ' ', last_name)",
- "parameter_style": "S",
- "is_deterministic": true,
- "sql_data_access": "CONTAINS_SQL",
- "security_type": "DEFINER",
- "specific_name": "my_function",
- "owner": "owner@example.com",
- "properties": "{\"sqlConfig.spark.sql.ansi.enabled\":\"true\",\"sqlConfig.spark.sql.sources.default\":\"delta\"}\n",
- "metastore_id": "24820380-2435-3444-6567-5893393",
- "full_name": "sample_catalog.default_schema.my_function",
- "created_at": 1700000000000,
- "created_by": "creator@example.com",
- "updated_at": 1700000500000,
- "updated_by": "updater@example.com",
- "function_id": "6838900-79004-8568-9099-235512",
- "securable_type": "FUNCTION",
- "securable_kind": "FUNCTION_STANDARD",
- "icon_type": "FUNCTION"
}
]List volumes in schema
Retrieves a list of volumes available under the specified metastore catalog and schema for the given workspace.
Authorizations:
path Parameters
| metastore_catalog_id required | string The ID of the Metastore Catalog to retrieve volumes from. |
query Parameters
| workspace_id required | string The ID of the workspace to retrieve volumes for. |
| catalog_name | string The name of the catalog to retrieve volumes from. |
| schema_name | string The name of the schema to retrieve volumes from. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "name": "sales_data_volume",
- "catalog_name": "enterprise_catalog",
- "schema_name": "finance_schema",
- "resource_name": "/metastores/13b97132-f334-405b-ad70-0a1e4a20d587/volumes/5202395f-24aa-4eb0-ab12-02158edbfe6a",
- "volume_type": "MANAGED",
- "storage_location": "abfss://databricks-metastore@corpdatabricks.dfs.core.windows.net/metastore/13b97132-f334-405b-ad70-0a1e4a20d587/volumes/5202395f-24aa-4eb0-ab12-02158edbfe6a",
- "owner": "data.engineer@company.com",
- "comment": "Volume storing quarterly sales data in Parquet format",
- "full_name": "enterprise_catalog.finance_schema.sales_data_volume",
- "volume_id": "5202395f-24aa-4eb0-ab12-02158edbfe6a",
- "metastore_id": "13b97132-f334-405b-ad70-0a1e4a20d587",
- "created_at": 1734086400000,
- "created_by": "data.engineer@company.com",
- "updated_at": 1734172800000,
- "updated_by": "lead.dataops@company.com",
- "securable_type": "VOLUME",
- "securable_kind": "VOLUME_STANDARD",
- "schema_id": "a81f1e1a-2bf3-4dd0-8afe-96fa5da49981",
- "catalog_id": "e4b7735d-fc8c-4716-903a-d6896474b355",
- "icon_type": "VOLUME"
}
]Get all the cluster configurations.
Retrieves a paginated list of cluster configurations available in the system. You can filter results by cloud provider, compute type, and machine architecture, and optionally request all data. Pagination is controlled with the limit and pageNumber parameters. This endpoint provides an overview of all the cluster configurations set up for use.
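A sketch of walking the paginated listing using limit, pageNumber, and the result_set block from the response sample follows; the endpoint path and auth header are assumptions.

import requests

BASE_URL = "https://yeedu.example.com/api/v1"            # hypothetical
HEADERS = {"Authorization": "Bearer <your-api-token>"}   # hypothetical

def list_cluster_configs(cloud_provider=None, limit=100):
    """Yield cluster configurations page by page, following result_set.next_page."""
    url = f"{BASE_URL}/cluster/configurations"   # hypothetical path
    page = 1
    while True:
        params = {"limit": limit, "pageNumber": page}
        if cloud_provider:
            params["cloud_provider"] = cloud_provider    # e.g. "Azure"
        body = requests.get(url, headers=HEADERS, params=params).json()
        for conf in body.get("data", []):
            yield conf
        result_set = body.get("result_set", {})
        next_page = result_set.get("next_page")
        if not next_page or result_set.get("current_page") == result_set.get("total_pages"):
            break
        page = next_page

for conf in list_cluster_configs(cloud_provider="Azure"):
    print(conf["cluster_conf_id"], conf["cluster_conf_name"])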
Authorizations:
query Parameters
| cloud_provider | string Enum: "GCP" "AWS" "Azure" Cloud provider used for filtering. |
| compute_type | string Enum: "compute_optimized" "memory_optimized" "general_purpose" "gpu_accelerated" "storage_optimized" "custom_compute" Compute type used for filtering. |
| architecture_type | string Enum: "x86_64" "aarch64" Machine Architecture Type used for filtering. |
| all | boolean Default: false Enum: true false A boolean that can be set to return all the data. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "cluster_conf_id": 60,
- "cluster_conf_name": "Standard_F2s_v24",
- "description": null,
- "machine_type_category": "compute_optimized",
- "machine_type": {
- "machine_type_id": 214,
- "cloud_provider": {
- "cloud_provider_id": 2,
- "name": "Azure"
}, - "name": "Standard_F2s_v24",
- "machine_architecture_type": {
- "machine_architecture_type_id": 0,
- "machine_architecture_type": "x86_64"
}, - "vCPUs": 2,
- "memory": "4 GiB",
- "has_cuda": false,
- "gpu_model": null,
- "gpus": 0,
- "gpu_memory": null,
- "cpu_model": [
- "Intel Xeon® Platinum 8168 (SkyLake)"
], - "cpu_min_frequency_GHz": [
- "2.7"
], - "cpu_max_frequency_GHz": [
- "3.7"
], - "has_local_disk": true,
- "local_disk_size_GB": 16,
- "local_num_of_disks": 1,
- "local_disk_bus_type": {
- "local_disk_bus_type_id": 0,
- "local_disk_bus_type": "SCSI"
}, - "local_disk_throughput_MB": null,
- "machine_price_ycu": 0.73
}, - "machine_volume_conf": null,
- "created_by": null,
- "modified_by": null,
- "last_update_date": "2024-04-05T14:17:18.031635+00:00",
- "from_date": "2024-04-05T14:17:18.031635+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 261,
- "total_pages": 261,
- "limit": 1,
- "next_page": 2
}
}Search cluster configurations based on configuration name.
Retrieves a list of cluster configurations filtered by a specified configuration name. Additional filters like cloud provider, compute type, and machine architecture can be applied. Supports pagination via the limit and pageNumber parameters. Use this endpoint to find cluster configurations matching a name pattern.
Authorizations:
query Parameters
| cluster_conf_name required | string Specifies the name of the cluster configuration for the search. |
| cloud_provider | string Enum: "GCP" "AWS" "Azure" Cloud provider used for filtering. |
| compute_type | string Enum: "compute_optimized" "memory_optimized" "general_purpose" "gpu_accelerated" "storage_optimized" "custom_compute" Compute type used for filtering. |
| architecture_type | string Enum: "x86_64" "aarch64" Machine Architecture Type used for filtering. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "cluster_conf_id": 60,
- "cluster_conf_name": "Standard_F2s_v24",
- "description": null,
- "machine_type_category": "compute_optimized",
- "machine_type": {
- "machine_type_id": 214,
- "cloud_provider": {
- "cloud_provider_id": 2,
- "name": "Azure"
}, - "name": "Standard_F2s_v24",
- "machine_architecture_type": {
- "machine_architecture_type_id": 0,
- "machine_architecture_type": "x86_64"
}, - "vCPUs": 2,
- "memory": "4 GiB",
- "has_cuda": false,
- "gpu_model": null,
- "gpus": 0,
- "gpu_memory": null,
- "cpu_model": [
- "Intel Xeon® Platinum 8168 (SkyLake)"
], - "cpu_min_frequency_GHz": [
- "2.7"
], - "cpu_max_frequency_GHz": [
- "3.7"
], - "has_local_disk": true,
- "local_disk_size_GB": 16,
- "local_num_of_disks": 1,
- "local_disk_bus_type": {
- "local_disk_bus_type_id": 0,
- "local_disk_bus_type": "SCSI"
}, - "local_disk_throughput_MB": null,
- "machine_price_ycu": 0.73
}, - "machine_volume_conf": null,
- "created_by": null,
- "modified_by": null,
- "last_update_date": "2024-04-05T14:17:18.031635+00:00",
- "from_date": "2024-04-05T14:17:18.031635+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 261,
- "total_pages": 261,
- "limit": 1,
- "next_page": 2
}
}Create a new cluster configuration.
Adds a new cluster configuration with the specified configuration details. The request body must include all required parameters defining the cluster configuration. If successful, returns the created cluster configuration.
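A minimal creation request might look like the sketch below, using the documented body fields; the endpoint path and auth header are placeholders for illustration.

import requests

BASE_URL = "https://yeedu.example.com/api/v1"            # hypothetical
HEADERS = {"Authorization": "Bearer <your-api-token>"}   # hypothetical

payload = {
    "name": "yeedu_cluster",                    # required, non-empty
    "description": "Cluster Configurations test",
    "machine_type_id": 76,                      # required
    "volume_conf_id": 1,                        # optional
}

resp = requests.post(f"{BASE_URL}/cluster/configuration",   # hypothetical path
                     headers=HEADERS, json=payload)
resp.raise_for_status()
print(resp.json()["cluster_conf_id"])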
Authorizations:
Request Body schema: application/json (required)
Cluster configuration to be added.
| name required | string non-empty The
|
| description | string or null non-empty |
| machine_type_id required | integer <int64> |
| volume_conf_id | integer <int64> |
Responses
Request samples
- Payload
{- "name": "yeedu_cluster",
- "description": "Cluster Configurations test",
- "machine_type_id": 76,
- "volume_conf_id": 1
}Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "cluster_conf_id": "1",
- "name": "yeedu_cluster",
- "description": "Cluster Configurations test",
- "machine_type_category_id": "1",
- "machine_type_id": "1",
- "volume_conf_id": "1",
- "created_by_user_id": "3",
- "modified_by_user_id": "3",
- "last_update_date": "2023-03-17T10:20:21.627Z",
- "from_date": "2023-03-17T10:20:21.627Z",
- "to_date": null
}Get details of a specific cluster configuration.
Retrieves detailed information of a cluster configuration filtered by either its unique ID or name. Use this endpoint to fetch full configuration data of an individual cluster configuration.
Authorizations:
query Parameters
| cluster_conf_id | integer <int64> Cluster configuration ID used for filtering. |
| cluster_conf_name | string Cluster configuration name used for filtering. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "cluster_conf_id": 261,
- "cluster_conf_name": "cluster_conf",
- "description": "creating aws cluster configuration",
- "machine_type_category": "general_purpose",
- "machine_type": {
- "machine_type_id": 89,
- "cloud_provider": {
- "cloud_provider_id": 1,
- "name": "AWS",
- "description": "Provider for creating infrastructure on Amazon Web Services"
}, - "name": "m5d.xlarge",
- "machine_architecture_type": {
- "machine_architecture_type_id": 0,
- "machine_architecture_type": "x86_64"
}, - "vCPUs": 4,
- "memory": "16 GiB",
- "has_cuda": false,
- "gpu_model": null,
- "gpus": 0,
- "gpu_memory": null,
- "cpu_model": [
- "Intel Xenon Platinum 8175"
], - "cpu_min_frequency_GHz": [
- "2.5"
], - "cpu_max_frequency_GHz": [
- "3.1"
], - "has_local_disk": true,
- "local_disk_size_GB": 150,
- "local_num_of_disks": 1,
- "local_disk_bus_type": {
- "local_disk_bus_type_id": 1,
- "local_disk_bus_type": "NVME"
}, - "local_disk_throughput_MB": null,
- "machine_price_ycu": 2.27
}, - "machine_volume_conf": {
- "volume_conf_id": 25,
- "name": "yeedu_aws_volume",
- "encrypted": true,
- "size": 200,
- "disk_type": {
- "disk_type_id": 3,
- "cloud_provider": {
- "cloud_provider_id": 1,
- "name": "AWS"
}, - "name": "gp2",
- "has_fixed_size": false,
- "min_size": 1,
- "max_size": 16000
}, - "machine_volume_num": 1,
- "machine_volume_strip_num": 1,
- "disk_iops": null,
- "disk_throughput_MB": null,
- "disk_num": 1,
- "disk_size": 200
}, - "created_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2024-04-12T13:27:29.754832+00:00",
- "from_date": "2024-04-12T13:27:29.754832+00:00",
- "to_date": "infinity"
}Update details of a specific cluster configuration.
Updates the details of a specific cluster configuration identified by its ID or name. The request body should include the updated configuration fields. Returns the updated cluster configuration on success.
Authorizations:
query Parameters
| cluster_conf_id | integer <int64> Cluster configuration ID used for modification. |
| cluster_conf_name | string Cluster configuration name used for modification. |
Request Body schema: application/json (required)
Cluster configuration details to be modified.
| name | string non-empty The
|
| description | string or null non-empty |
| machine_type_id | integer <int64> |
| volume_conf_id | integer or null <int64> |
Responses
Request samples
- Payload
{- "name": "Yeedu Cluster Configuration",
- "description": "Cluster Configurations test",
- "machine_type_id": 75,
- "volume_conf_id": 1
}Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "cluster_conf_id": "1",
- "name": "yeedu_cluster",
- "description": "Cluster Configurations test",
- "machine_type_category_id": "1",
- "machine_type_id": "1",
- "volume_conf_id": "1",
- "created_by_user_id": "3",
- "modified_by_user_id": "3",
- "last_update_date": "2023-03-17T10:20:21.627Z",
- "from_date": "2023-03-17T10:20:21.627Z",
- "to_date": null
}Delete a specific cluster configuration.
Deletes a specific cluster configuration identified by its ID or name. If successful, returns a confirmation message.
Authorizations:
query Parameters
| cluster_conf_id | integer <int64> Cluster configuration ID used for deletion. |
| cluster_conf_name | string Cluster configuration name used for deletion. |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 500
{- "message": "Deleted Cluster Configuration Id: 1"
}Get all the cluster instances.
Retrieves a list of cluster instances.
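Array-valued filters such as cluster_status and cloud_providers can be passed as repeated query parameters, as in the sketch below; the path, auth header, and exact array-encoding convention are assumptions made for illustration.

import requests

BASE_URL = "https://yeedu.example.com/api/v1"            # hypothetical
HEADERS = {"Authorization": "Bearer <your-api-token>"}   # hypothetical

# requests encodes lists as repeated parameters, e.g.
# cluster_status=RUNNING&cluster_status=STOPPED (server-side convention assumed).
params = {
    "cluster_status": ["RUNNING", "STOPPED"],
    "cloud_providers": ["AWS"],
    "cluster_types": ["YEEDU"],
    "enable": True,
    "limit": 50,
    "pageNumber": 1,
}

body = requests.get(f"{BASE_URL}/clusters",              # hypothetical path
                    headers=HEADERS, params=params).json()
for cluster in body["data"]:
    print(cluster["cluster_id"], cluster["name"], cluster["cluster_status"])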
Authorizations:
query Parameters
| cluster_status | Array of strings Items Enum: "INITIATING" "RUNNING" "STOPPING" "STOPPED" "DESTROYING" "DESTROYED" "ERROR" "RESIZING_UP" "RESIZING_DOWN" Specifies the cluster instance statuses to be used as a filter. |
| cluster_conf_id | integer <int64> Cluster configuration ID used for filtering. |
| cluster_conf_name | string Cluster configuration name used for filtering. |
| enable | boolean Enum: true false Specifies which clusters to list.
Note: If unspecified, all clusters (both active and disabled) will be listed. |
| cloud_providers | Array of strings Items Enum: "GCP" "AWS" "Azure" Specifies the cloud providers to be used as a filter. |
| cluster_types | Array of strings Items Enum: "YEEDU" "STANDALONE" "CLUSTER" Specifies the cluster types to be used as a filter. |
| spark_infra_version_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of runtime version IDs to filter on. |
| machine_type_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of machine type IDs to filter on. |
| created_by_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of created by user IDs to filter on. |
| modified_by_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of modified by user IDs to filter on. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "cluster_id": 21,
- "name": "test-cluster",
- "cluster_status": "RUNNING",
- "cluster_type": "YEEDU",
- "instance_size": 1,
- "min_instances": 1,
- "max_instances": 1,
- "clean_up_timeout": 240,
- "total_ycu": 597.237,
- "cloud_env": {
- "cloud_env_id": 1,
- "name": "aws_load_test_cloud_env",
- "cloud_provider": {
- "cloud_provider_id": 1,
- "name": "AWS"
}
}, - "cluster_conf": {
- "cluster_conf_id": 156,
- "cluster_conf_name": "m5dn.xlarge",
- "machine_type_category": "general_purpose",
- "machine_type": {
- "machine_type_id": 96,
- "name": "m5dn.xlarge",
- "machine_architecture_type": {
- "machine_architecture_type_id": 0,
- "machine_architecture_type": "x86_64"
}, - "vCPUs": 4,
- "memory": "16 GiB",
- "has_cuda": false,
- "gpu_model": null,
- "gpus": 0,
- "gpu_memory": null,
- "cpu_model": [
- "Intel Xeon Platinum 8259"
], - "cpu_min_frequency_GHz": [
- "2.5"
], - "cpu_max_frequency_GHz": [
- "3.5"
], - "has_local_disk": true,
- "local_disk_size_GB": 150,
- "local_num_of_disks": 1,
- "local_disk_throughput_MB": null,
- "machine_price_ycu": 2.27
}, - "machine_volume_conf": null
}, - "machine_volume_conf": {
- "volume_conf_id": 123,
- "name": "volume_1",
- "machine_volume_num": 2,
- "machine_volume_strip_num": 1,
- "size": 100,
- "disk_iops": 5000,
- "disk_throughput_MB": 200,
- "disk_type": {
- "disk_type_id": 5,
- "name": "SSD",
- "cloud_provider": {
- "cloud_provider_id": 1,
- "name": "AWS"
}, - "has_fixed_size": true,
- "min_size": 50,
- "max_size": 500,
- "has_fixed_throughput": true,
- "min_throughput": 100,
- "max_throughput": 1000,
- "has_fixed_iops": true,
- "min_iops": 1000,
- "max_iops": 10000
}
}, - "spark_infra_version": {
- "spark_infra_version_id": 1,
- "spark_docker_image_name": "v3.2.2-28",
- "spark_version": "3.2.2",
- "hive_version": "2.3.9",
- "hadoop_version": "3.2.4",
- "scala_version": "2.12.15",
- "python_version": "3.9.5",
- "notebook_support": true,
- "has_cuda_support": true,
- "thrift_support": false,
- "yeedu_functions_support": true
}, - "engine_cluster_spark_config": {
- "max_parallel_spark_job_execution_per_instance": 5,
- "num_of_workers": null
}, - "tenant_id": "a6a9c5ea-57b6-4a1c-aa99-84645f675b62",
- "created_by": {
- "user_id": 4,
- "username": "ya0000@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2024-04-28T14:55:32.795+00:00",
- "from_date": "2024-04-19T15:20:49.734293+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 2,
- "total_pages": 2,
- "limit": 1,
- "next_page": 2
}
}Search cluster instance based on name.
Retrieves a list of cluster instances filtered by cluster name and optional filters such as status, configuration ID or name, and active/disabled status.
Authorizations:
query Parameters
| cluster_name required | string Cluster instance name used for filtering. |
| cluster_status | Array of strings Items Enum: "INITIATING" "RUNNING" "STOPPING" "STOPPED" "DESTROYING" "DESTROYED" "ERROR" "RESIZING_UP" "RESIZING_DOWN" Specifies the cluster instance statuses to be used as a filter. |
| cluster_conf_id | integer <int64> Cluster configuration ID used for filtering. |
| cluster_conf_name | string Cluster configuration name used for filtering. |
| enable | boolean Enum: true false Specifies which clusters to list.
Note: If unspecified, all clusters (both active and disabled) will be listed. |
| cloud_providers | Array of strings Items Enum: "GCP" "AWS" "Azure" Specifies the cloud providers to be used as a filter. |
| cluster_types | Array of strings Items Enum: "YEEDU" "STANDALONE" "CLUSTER" Specifies the cluster types to be used as a filter. |
| spark_infra_version_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of runtime version IDs to filter on. |
| machine_type_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of machine type IDs to filter on. |
| created_by_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of created by user IDs to filter on. |
| modified_by_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of modified by user IDs to filter on. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "cluster_id": 21,
- "name": "test-cluster",
- "cluster_status": "RUNNING",
- "cluster_type": "YEEDU",
- "instance_size": 1,
- "min_instances": 1,
- "max_instances": 1,
- "clean_up_timeout": 240,
- "total_ycu": 597.237,
- "cloud_env": {
- "cloud_env_id": 1,
- "name": "aws_load_test_cloud_env",
- "cloud_provider": {
- "cloud_provider_id": 1,
- "name": "AWS"
}
}, - "cluster_conf": {
- "cluster_conf_id": 156,
- "cluster_conf_name": "m5dn.xlarge",
- "machine_type_category": "general_purpose",
- "machine_type": {
- "machine_type_id": 96,
- "name": "m5dn.xlarge",
- "machine_architecture_type": {
- "machine_architecture_type_id": 0,
- "machine_architecture_type": "x86_64"
}, - "vCPUs": 4,
- "memory": "16 GiB",
- "has_cuda": false,
- "gpu_model": null,
- "gpus": 0,
- "gpu_memory": null,
- "cpu_model": [
- "Intel Xeon Platinum 8259"
], - "cpu_min_frequency_GHz": [
- "2.5"
], - "cpu_max_frequency_GHz": [
- "3.5"
], - "has_local_disk": true,
- "local_disk_size_GB": 150,
- "local_num_of_disks": 1,
- "local_disk_throughput_MB": null,
- "machine_price_ycu": 2.27
}, - "machine_volume_conf": null
}, - "machine_volume_conf": {
- "volume_conf_id": 123,
- "name": "volume_1",
- "machine_volume_num": 2,
- "machine_volume_strip_num": 1,
- "size": 100,
- "disk_iops": 5000,
- "disk_throughput_MB": 200,
- "disk_type": {
- "disk_type_id": 5,
- "name": "SSD",
- "cloud_provider": {
- "cloud_provider_id": 1,
- "name": "AWS"
}, - "has_fixed_size": true,
- "min_size": 50,
- "max_size": 500,
- "has_fixed_throughput": true,
- "min_throughput": 100,
- "max_throughput": 1000,
- "has_fixed_iops": true,
- "min_iops": 1000,
- "max_iops": 10000
}
}, - "spark_infra_version": {
- "spark_infra_version_id": 1,
- "spark_docker_image_name": "v3.2.2-28",
- "spark_version": "3.2.2",
- "hive_version": "2.3.9",
- "hadoop_version": "3.2.4",
- "scala_version": "2.12.15",
- "python_version": "3.9.5",
- "notebook_support": true,
- "has_cuda_support": true,
- "thrift_support": false,
- "yeedu_functions_support": true
}, - "engine_cluster_spark_config": {
- "max_parallel_spark_job_execution_per_instance": 5,
- "num_of_workers": null
}, - "tenant_id": "a6a9c5ea-57b6-4a1c-aa99-84645f675b62",
- "created_by": {
- "user_id": 4,
- "username": "ya0000@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2024-04-28T14:55:32.795+00:00",
- "from_date": "2024-04-19T15:20:49.734293+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 2,
- "total_pages": 2,
- "limit": 1,
- "next_page": 2
}
}Calculate Spark driver memory.
Calculates the Spark driver memory based on the provided machine memory and the maximum number of parallel Spark job executions per instance.
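A one-off call might look like the sketch below; the path and auth header are placeholders.

import requests

BASE_URL = "https://yeedu.example.com/api/v1"            # hypothetical
HEADERS = {"Authorization": "Bearer <your-api-token>"}   # hypothetical

params = {
    "machine_memory": 16,                                    # total machine memory in GB
    "max_parallel_spark_job_execution_per_instance": 5,
}
resp = requests.get(f"{BASE_URL}/cluster/spark-driver-memory",   # hypothetical path
                    headers=HEADERS, params=params)
print(resp.json()["message"])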
Authorizations:
query Parameters
| machine_memory required | integer <int64> Total machine memory in GB |
| max_parallel_spark_job_execution_per_instance required | integer <int64> Maximum parallel Spark job execution per instance |
Responses
Response samples
- 200
- 400
- 401
- 404
- 409
- 500
{- "message": "For each job 1G of Spark driver memory will be allocated."
}Create a new cluster instance.
Create a new cluster instance with provided configuration.
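A sketch using only the fields marked required in the schema below, plus a few common options, is shown here; the endpoint path and auth header are hypothetical.

import requests

BASE_URL = "https://yeedu.example.com/api/v1"            # hypothetical
HEADERS = {"Authorization": "Bearer <your-api-token>"}   # hypothetical

payload = {
    "name": "yeedu_instance",          # required
    "is_spot_instance": False,         # required
    "enable_public_ip": False,         # required
    "block_project_ssh_keys": False,   # required
    "cloud_env_id": 1,                 # required
    "cluster_conf_id": 1,              # required
    "spark_infra_version_id": 0,       # required
    "cluster_type": "YEEDU",           # required: YEEDU | STANDALONE | CLUSTER
    "idle_timeout_ms": 1200000,        # optional
    "min_instances": 1,                # optional
    "max_instances": 3,                # optional
}

resp = requests.post(f"{BASE_URL}/cluster",              # hypothetical path
                     headers=HEADERS, json=payload)
resp.raise_for_status()
print(resp.json()["cluster_id"])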
Authorizations:
Request Body schema: application/json (required)
Cluster instance configuration to be added.
| name required | string non-empty The
|
| description | string or null non-empty |
| idle_timeout_ms | integer <int64> |
| labels | object |
| is_spot_instance required | boolean Default: false |
| is_turbo_enabled | boolean Default: false |
| is_cuda_enabled | boolean Default: false |
| enable_public_ip required | boolean Default: false |
| block_project_ssh_keys required | boolean Default: false |
| bootstrap_shell_script | string or null non-empty |
| cloud_env_id required | integer <int64> |
| object_storage_manager_id | integer or null <int64> |
| cluster_conf_id required | integer <int64> |
object | |
| spark_infra_version_id required | integer <int64> |
| metastore_catalog_id | integer or null <int64> |
object | |
| cluster_type required | string Enum: "YEEDU" "STANDALONE" "CLUSTER" |
| min_instances | integer or null <int64> >= 1 |
| max_instances | integer or null <int64> [ 1 .. 30 ] |
| clean_up_timeout | integer <int64> |
| keep_scratch_disk | boolean or null |
| disk_type_id | integer <int64> |
| disk_iops | integer <int64> |
| disk_throughput_MB | integer <int64> |
| size | integer <int64> |
| number_of_disks | integer <int64> >= 1 |
Responses
Request samples
- Payload
{- "name": "yeedu_instance",
- "description": "Test yeedu instance",
- "idle_timeout_ms": 1200000,
- "labels": {
- "resource": "yeedu"
}, - "is_spot_instance": false,
- "enable_public_ip": false,
- "block_project_ssh_keys": false,
- "cloud_env_id": 1,
- "object_storage_manager_id": 1,
- "cluster_conf_id": 1,
- "spark_infra_version_id": 0,
- "is_turbo_enabled": false,
- "is_cuda_enabled": false,
- "spark_config": {
- "conf": [
- "key1 value1",
- "key2 value2"
], - "packages": [
- "string"
], - "repositories": [
- "string"
], - "jars": [
- "string"
], - "archives": [
- "string"
], - "env_var": [
- "key1=value1",
- "key2=value2"
], - "files": [
- "string"
], - "py-files": [
- "string"
], - "conf_secret": {
- "key": "value"
}, - "env_var_secret": {
- "key": "value"
}
}, - "engine_cluster_spark_config": {
- "max_parallel_spark_job_execution_per_instance": 5
}, - "cluster_type": "YEEDU",
- "min_instances": 1,
- "max_instances": 3,
- "clean_up_timeout": 240,
- "disk_type_id": 8,
- "disk_iops": 3000,
- "disk_throughput_MB": 126,
- "size": 125,
- "number_of_disks": 2
}Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "cluster_id": "27",
- "name": "yeedu_instance_new_7",
- "description": "Test yeedu instance",
- "cloud_env_id": "3",
- "idle_timeout_ms": "1200000",
- "labels": {
- "resource": "yeedu",
- "vm": "yeedu_node",
- "tenant_id": "15a095ee-0fc0-47ec-a44a-b6d9c9455a42"
}, - "is_spot_instance": false,
- "enable_public_ip": false,
- "is_turbo_enabled": false,
- "is_cuda_enabled": false,
- "block_project_ssh_keys": false,
- "bootstrap_shell_script": null,
- "object_storage_manager_id": "3",
- "cluster_conf_id": "76",
- "spark_config": {
- "conf": [
- "key1 value1",
- "key2 value2"
], - "packages": [
- "string"
], - "repositories": [
- "string"
], - "jars": [
- "string"
], - "archives": [
- "string"
], - "env_var": [
- "key1=value1",
- "key2=value2"
], - "conf_secret": [
- "key"
], - "env_var_secret": [
- "key"
], - "files": [
- "string"
], - "py-files": [
- "string"
]
}, - "metastore_catalog_id": null,
- "spark_infra_version_id": "0",
- "engine_cluster_spark_config": {
- "max_parallel_spark_job_execution_per_instance": 5,
- "num_of_workers": null
}, - "cluster_type": "YEEDU",
- "min_instances": 1,
- "max_instances": 3,
- "clean_up_timeout": "240",
- "keep_scratch_disk": true,
- "machine_volume_config": {
- "machine_volume_config_id": "41",
- "machine_volume_config_name": "cluster_volume_99f47016-b18a-4f31-8efd-ff02f6a97964",
- "machine_volume_config_description": null,
- "size": "4",
- "encrypted": false,
- "machine_volume_conf_num": 2,
- "machine_volume_conf_strip": 2,
- "number_of_disks": 2,
- "disk_iops": 3000,
- "disk_throughput_MB": 125,
- "disk_type": {
- "disk_type_id": "9",
- "disk_name": "UltraSSD",
- "cloud_provider_id": "2",
- "has_fixed_size": false,
- "min_size": 4,
- "max_size": 65536,
- "has_fixed_iops": false,
- "min_iops": 3000,
- "max_iops": 400000,
- "has_fixed_throughput": false,
- "min_throughput": 125,
- "max_throughput": 10000
}
}, - "tenant_id": "15a095ee-0fc0-47ec-a44a-b6d9c9455a42",
- "created_by_user_id": "9",
- "modified_by_user_id": "9",
- "last_update_date": "2025-06-11T13:27:04.644Z",
- "from_date": "2025-06-11T13:27:04.644Z",
- "to_date": null
}Get details of a specific cluster instance.
Retrieve cluster instance details filtered by ID or name.
Authorizations:
query Parameters
| cluster_id | integer <int64> Cluster instance ID used for filtering. |
| cluster_name | string Cluster instance name used for filtering. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "cluster_id": 21,
- "name": "test-cluster",
- "description": "Created for jobs load test",
- "labels": {
- "resource": "yeedu",
- "vm": "yeedu_node",
- "tenant_id": "a6a9c5ea-57b6-4a1c-aa99-84645f675b62"
}, - "idle_timeout_ms": 1800000,
- "cluster_status": "RUNNING",
- "cluster_type": "YEEDU",
- "instance_size": 1,
- "is_spot_instance": false,
- "enable_public_ip": false,
- "block_project_ssh_keys": false,
- "min_instances": 1,
- "max_instances": 1,
- "clean_up_timeout": 240,
- "total_ycu": 597.237,
- "is_turbo_enabled": false,
- "is_cuda_enabled": false,
- "bootstrap_shell_script": "#!/bin/bash\n \necho \"10.10.30.196 test.yeedu.com\" >> /etc/hosts\n",
- "cluster_conf": {
- "cluster_conf_id": 156,
- "cluster_conf_name": "m5dn.xlarge",
- "description": null,
- "machine_type_category": "general_purpose",
- "machine_type": {
- "machine_type_id": 96,
- "name": "m5dn.xlarge",
- "machine_architecture_type": {
- "machine_architecture_type_id": 0,
- "machine_architecture_type": "x86_64"
}, - "vCPUs": 4,
- "memory": "16 GiB",
- "has_cuda": false,
- "gpu_model": null,
- "gpus": 0,
- "gpu_memory": null,
- "cpu_model": [
- "Intel Xeon Platinum 8259"
], - "cpu_min_frequency_GHz": [
- "2.5"
], - "cpu_max_frequency_GHz": [
- "3.5"
], - "has_local_disk": true,
- "local_disk_size_GB": 150,
- "local_num_of_disks": 1,
- "local_disk_bus_type": {
- "local_disk_bus_type_id": 1,
- "local_disk_bus_type": "NVME"
}, - "local_disk_throughput_MB": null,
- "machine_price_ycu": 2.27
}, - "machine_volume_conf": null
}, - "machine_volume_conf": {
- "volume_conf_id": 123,
- "name": "volume_1",
- "machine_volume_num": 2,
- "machine_volume_strip_num": 1,
- "size": 100,
- "disk_iops": 5000,
- "disk_throughput_MB": 200,
- "disk_type": {
- "disk_type_id": 5,
- "name": "SSD",
- "cloud_provider": {
- "cloud_provider_id": 1,
- "name": "AWS"
}, - "has_fixed_size": true,
- "min_size": 50,
- "max_size": 500,
- "has_fixed_throughput": true,
- "min_throughput": 100,
- "max_throughput": 1000,
- "has_fixed_iops": true,
- "min_iops": 1000,
- "max_iops": 10000
}
}, - "cloud_env": {
- "cloud_env_id": 1,
- "name": "aws_load_test_cloud_env",
- "description": "",
- "cloud_provider": {
- "cloud_provider_id": 1,
- "name": "AWS",
- "description": "Provider for creating infrastructure on Amazon Web Services"
}, - "availabilty_zone": {
- "availabilty_zone_id": 103,
- "name": "us-east-2",
- "cloud_provider_id": 1,
- "region": "us-east-2",
- "description": "Ohio, US East"
}, - "machine_network": {
- "machine_network_conf_id": 1,
- "name": "aws_network",
- "description": "creating gcp network config",
- "network_project_id": "7986-5432-1098",
- "network_name": "eni-7986d54edd3c2fe1f",
- "network_tags": [
- "sg-0ca79acefcfd865fa"
], - "subnet": "subnet-9d87654a32105d42d",
- "machine_network_availability_zone": {
- "availability_zone_id": 103,
- "name": "us-east-2",
- "cloud_provider_id": 1,
- "region": "us-east-2",
- "description": "Ohio, US East"
}
}, - "cloud_project": "798654321098",
- "credentials_config": {
- "credential_config_id": 1,
- "name": "aws_genomic_security",
- "description": null,
- "credential_type": {
- "credential_type_id": 1,
- "name": "AWS Access Secret Key Pair",
- "cloud_provider_id": 1
}
}, - "boot_disk_image": {
- "boot_disk_image_id": 3,
- "name": "aws_ubuntu",
- "description": "Base image for AWS Ubuntu",
- "cloud_provider_id": 1,
- "linux_distro": {
- "linux_distro_id": 0,
- "distro_name": "UBUNTU",
- "distro_version": "20.04 LTS"
}, - "boot_disk_image": "ami-9d87654a32105d42d"
}
}, - "spark_infra_version": {
- "spark_infra_version_id": 1,
- "spark_docker_image_name": "v3.2.2-28",
- "spark_version": "3.2.2",
- "hive_version": "2.3.9",
- "hadoop_version": "3.2.4",
- "scala_version": "2.12.15",
- "python_version": "3.9.5",
- "notebook_support": true,
- "has_cuda_support": true,
- "thrift_support": false,
- "yeedu_functions_support": true
}, - "spark_config": {
- "conf": [ ],
- "packages": [ ],
- "repositories": [ ],
- "jars": [ ],
- "archives": [ ],
- "env_var": [ ],
- "conf_secret": null,
- "env_var_secret": null,
- "files": [ ],
- "py-files": [ ]
}, - "engine_cluster_spark_config": {
- "max_parallel_spark_job_execution_per_instance": 5,
- "num_of_workers": null
}, - "object_storage_manager": {
- "object_storage_manager_id": 23,
- "name": "jobs-osm",
- "description": "jobs",
- "credentials_config": {
- "credential_config_id": 1,
- "name": "aws_genomic_security",
- "description": null,
- "credential_type_name": "AWS Access Secret Key Pair"
}, - "object_storage_bucket_name": "yeedu"
}, - "workflow_job_instance_details": {
- "workflow_job_instance_status": {
- "workflow_job_instance_id": 850897,
- "workflow_job_id": 850897,
- "status": "DONE",
- "from_date": "2024-04-29T08:43:27.119079+00:00",
- "to_date": "2024-04-29T08:44:01.959592+00:00"
}
}, - "tenant_id": "a6a9c5ea-57b6-4a1c-aa99-84645f675b62",
- "created_by": {
- "user_id": 4,
- "username": "ya0000@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2024-04-28T14:55:32.795+00:00",
- "from_date": "2024-04-19T15:20:49.734293+00:00",
- "to_date": "infinity"
}Update details of a specific cluster instance.
The following parameters are permitted for editing in different cluster states (see the sketch after this list):
In all cluster states:
- name
- description
- idle_timeout_ms
- clean_up_timeout
Only when the cluster is in a DESTROYED/ERROR state:
- labels
- block_project_ssh_keys
- enable_public_ip
- bootstrap_shell_script
- cloud_env_id
- object_storage_manager_id
- cluster_conf_id
- spark_infra_version_id
- is_turbo_enabled
- is_cuda_enabled
- is_spot_instance
- engine_cluster_spark_config:
- max_parallel_spark_job_execution_per_instance
- num_of_workers
- spark_config:
- conf
- packages
- repositories
- jars
- archives
- env_var
- conf_secret
- env_var_secret
- files
- "py-files"
- min_instances
- max_instances
- keep_scratch_disk
- metastore_catalog_id
- disk_type_id
- disk_iops
- disk_throughput_MB
- size
- number_of_disks
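The sketch below (pure client-side Python, no HTTP call) filters an update payload down to the always-editable fields unless the cluster is DESTROYED or in ERROR, mirroring the rules listed above.

# Fields editable in all cluster states, per the list above.
ALWAYS_EDITABLE = {"name", "description", "idle_timeout_ms", "clean_up_timeout"}

def restrict_update_payload(payload: dict, cluster_status: str) -> dict:
    """Drop fields that are only editable when the cluster is DESTROYED or in ERROR."""
    if cluster_status in ("DESTROYED", "ERROR"):
        return payload  # every documented field is editable in these states
    kept = {k: v for k, v in payload.items() if k in ALWAYS_EDITABLE}
    dropped = sorted(set(payload) - set(kept))
    if dropped:
        print(f"Skipping fields not editable while {cluster_status}: {dropped}")
    return kept

print(restrict_update_payload(
    {"name": "yeedu_instance", "idle_timeout_ms": 1200000, "cluster_conf_id": 1},
    cluster_status="RUNNING",
))
# -> keeps name and idle_timeout_ms, drops cluster_conf_id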
Authorizations:
query Parameters
| cluster_id | integer <int64> Cluster instance ID used for modification. |
| cluster_name | string Cluster instance name used for modification. |
Request Body schema: application/json (required)
Cluster instance details to be modified.
| name | string non-empty The
|
| description | string or null non-empty |
| idle_timeout_ms | integer <int64> |
| labels | object |
| enable_public_ip | boolean |
| block_project_ssh_keys | boolean |
| bootstrap_shell_script | string or null non-empty |
| cloud_env_id | integer <int64> |
| cluster_conf_id | integer <int64> |
| is_turbo_enabled | boolean |
| is_cuda_enabled | boolean |
| is_spot_instance | boolean |
| object_storage_manager_id | integer or null <int64> |
object | |
| spark_infra_version_id | integer <int64> |
| metastore_catalog_id | integer or null <int64> |
object | |
| min_instances | integer or null <int64> >= 1 |
| max_instances | integer or null <int64> [ 1 .. 30 ] |
| clean_up_timeout | integer <int64> |
| keep_scratch_disk | boolean or null |
| disk_type_id | integer or null <int64> |
| disk_iops | integer <int64> |
| disk_throughput_MB | integer <int64> |
| size | integer <int64> |
| number_of_disks | integer <int64> >= 1 |
Responses
Request samples
- Payload
{- "name": "yeedu_instance",
- "description": "Test yeedu instance",
- "idle_timeout_ms": 1200000,
- "labels": {
- "resource": "yeedu"
}, - "enable_public_ip": false,
- "block_project_ssh_keys": false,
- "cloud_env_id": 1,
- "object_storage_manager_id": 1,
- "cluster_conf_id": 1,
- "spark_infra_version_id": 1,
- "is_turbo_enabled": false,
- "is_cuda_enabled": false,
- "is_spot_instance": false,
- "spark_config": {
- "conf": [
- "key1 value1",
- "key2 value2"
], - "packages": [
- "string"
], - "repositories": [
- "string"
], - "jars": [
- "string"
], - "archives": [
- "string"
], - "env_var": [
- "key1=value1",
- "key2=value2"
], - "conf_secret": {
- "key": "value"
}, - "env_var_secret": {
- "key": "value"
}, - "files": [
- "string"
], - "py-files": [
- "string"
]
}, - "engine_cluster_spark_config": {
- "max_parallel_spark_job_execution_per_instance": 5
}, - "min_instances": 1,
- "max_instances": 3,
- "clean_up_timeout": 240,
- "disk_type_id": "5,",
- "disk_iops": "120,",
- "disk_throughput": "25,",
- "size": 125,
- "number_of_disks": 2
}Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "cluster_id": "27",
- "name": "yeedu_instance_new_7",
- "description": "Test yeedu instance",
- "cloud_env_id": "3",
- "idle_timeout_ms": "1200000",
- "labels": {
- "resource": "yeedu",
- "vm": "yeedu_node",
- "tenant_id": "15a095ee-0fc0-47ec-a44a-b6d9c9455a42"
}, - "is_spot_instance": false,
- "enable_public_ip": false,
- "is_turbo_enabled": false,
- "is_cuda_enabled": false,
- "block_project_ssh_keys": false,
- "bootstrap_shell_script": null,
- "object_storage_manager_id": "3",
- "cluster_conf_id": "76",
- "spark_config": {
- "conf": [
- "key1 value1",
- "key2 value2"
], - "packages": [
- "string"
], - "repositories": [
- "string"
], - "jars": [
- "string"
], - "archives": [
- "string"
], - "env_var": [
- "key1=value1",
- "key2=value2"
], - "conf_secret": [
- "key"
], - "env_var_secret": [
- "key"
], - "files": [
- "string"
], - "py-files": [
- "string"
]
}, - "metastore_catalog_id": null,
- "spark_infra_version_id": "0",
- "engine_cluster_spark_config": {
- "max_parallel_spark_job_execution_per_instance": 5,
- "num_of_workers": null
}, - "cluster_type": "YEEDU",
- "min_instances": 1,
- "max_instances": 3,
- "clean_up_timeout": "240",
- "keep_scratch_disk": true,
- "machine_volume_config": {
- "machine_volume_config_id": "41",
- "machine_volume_config_name": "cluster_volume_99f47016-b18a-4f31-8efd-ff02f6a97964",
- "machine_volume_config_description": null,
- "size": "4",
- "encrypted": false,
- "machine_volume_conf_num": 2,
- "machine_volume_conf_strip": 2,
- "number_of_disks": 2,
- "disk_iops": 3000,
- "disk_throughput_MB": 125,
- "disk_type": {
- "disk_type_id": "9",
- "disk_name": "UltraSSD",
- "cloud_provider_id": "2",
- "has_fixed_size": false,
- "min_size": 4,
- "max_size": 65536,
- "has_fixed_iops": false,
- "min_iops": 3000,
- "max_iops": 400000,
- "has_fixed_throughput": false,
- "min_throughput": 125,
- "max_throughput": 10000
}
}, - "tenant_id": "15a095ee-0fc0-47ec-a44a-b6d9c9455a42",
- "created_by_user_id": "9",
- "modified_by_user_id": "9",
- "last_update_date": "2025-06-11T13:27:04.644Z",
- "from_date": "2025-06-11T13:27:04.644Z",
- "to_date": null
}Start a cluster instance.
Start a cluster instance by ID or name. Only cluster instances in the STOPPED state can be started.
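A minimal start request is sketched below; the path and auth header are assumptions, while the body and response fields follow this reference.

import requests

BASE_URL = "https://yeedu.example.com/api/v1"            # hypothetical
HEADERS = {"Authorization": "Bearer <your-api-token>"}   # hypothetical

# Either cluster_id or cluster_name identifies the instance to start.
resp = requests.post(f"{BASE_URL}/cluster/start",        # hypothetical path
                     headers=HEADERS, json={"cluster_id": 1})
resp.raise_for_status()
print(resp.json()["CosiStart"]["workflow_job_id"])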
Authorizations:
Request Body schema: application/json (required)
Cluster instance to be started.
| cluster_id | integer <int64> |
| cluster_name | string non-empty |
Responses
Request samples
- Payload
{- "cluster_id": 1
}Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "CosiStart": {
- "workflow_job_id": "1",
- "workflow_job_instance_id": "1",
- "engine_cluster_instance_id": "1"
}
}Stop a cluster instance.
Stop a cluster instance by ID or name. Only cluster instances in the RUNNING state can be stopped.
Authorizations:
Request Body schema: application/json (required)
Cluster instance to be stopped.
| cluster_id | integer <int64> |
| cluster_name | string non-empty |
Responses
Request samples
- Payload
{- "cluster_id": 1,
- "keep_scratch_disk": false
}Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "CosiStop": {
- "workflow_job_id": "1",
- "workflow_job_instance_id": "1",
- "engine_cluster_instance_id": 1
}
}Destroy a cluster instance.
Destroy a cluster instance by ID or name. Only cluster instances in the RUNNING, STOPPED, or ERROR states can be destroyed.
Authorizations:
Request Body schema: application/json (required)
Cluster instance to be destroyed.
| cluster_id | integer <int64> |
| cluster_name | string non-empty |
Responses
Request samples
- Payload
{- "cluster_id": 1
}Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "CosiDestroy": {
- "workflow_job_id": 1,
- "workflow_job_instance_id": 1,
- "engine_cluster_instance_id": 1
}
}Disable a cluster instance.
Disable a cluster instance by ID or name. Only cluster instances in the DESTROYED state can be disabled.
Authorizations:
Request Body schema: application/json (required)
Cluster instance to be disabled.
| cluster_id | integer <int64> |
| cluster_name | string non-empty |
Responses
Request samples
- Payload
{- "cluster_id": 1
}Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "message": "The cluster_id : 1 has been disabled."
}Get all the cluster instance states.
Retrieve a list of cluster instance events (states) filtered by ID or name.
Authorizations:
query Parameters
| cluster_id | integer <int64> Cluster instance ID used for filtering. |
| cluster_name | string Cluster instance name used for filtering. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "cluster_status_id": 1143,
- "cluster_status": "RUNNING",
- "current_node_size": 1,
- "created_by": null,
- "start_time": "2024-04-29T08:44:01.959592+00:00",
- "end_time": "infinity"
}, - {
- "cluster_status_id": 1141,
- "cluster_status": "INITIATING",
- "current_node_size": 1,
- "created_by": null,
- "start_time": "2024-04-29T08:43:27.119079+00:00",
- "end_time": "2024-04-29T08:44:01.959592+00:00"
}, - {
- "cluster_status_id": 1139,
- "cluster_status": "STOPPED",
- "current_node_size": 0,
- "created_by": null,
- "start_time": "2024-04-29T07:14:31.59839+00:00",
- "end_time": "2024-04-29T08:43:27.119079+00:00"
}, - {
- "cluster_status_id": 1138,
- "cluster_status": "STOPPING",
- "current_node_size": 0,
- "created_by": "ysu0000@yeedu.io",
- "start_time": "2024-04-29T07:12:58.081903+00:00",
- "end_time": "2024-04-29T07:14:31.59839+00:00"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 34,
- "total_pages": 9,
- "limit": 4,
- "next_page": 2
}
}Enable a cluster instance.
Enable a cluster instance by ID or name. Only cluster instances that are disabled can be enabled.
Authorizations:
Request Body schema: application/json (required)
Cluster instance to be enabled.
| cluster_id | integer <int64> |
| cluster_name | string non-empty |
Responses
Request samples
- Payload
{- "cluster_id": 1
}Response samples
- 200
- 400
- 401
- 403
- 404
- 500
{- "cluster_id": "27",
- "name": "yeedu_instance_new_7",
- "description": "Test yeedu instance",
- "cloud_env_id": "3",
- "idle_timeout_ms": "1200000",
- "labels": {
- "resource": "yeedu",
- "vm": "yeedu_node",
- "tenant_id": "15a095ee-0fc0-47ec-a44a-b6d9c9455a42"
}, - "is_spot_instance": false,
- "enable_public_ip": false,
- "is_turbo_enabled": false,
- "is_cuda_enabled": false,
- "block_project_ssh_keys": false,
- "bootstrap_shell_script": null,
- "object_storage_manager_id": "3",
- "cluster_conf_id": "76",
- "spark_config": {
- "conf": [
- "key1 value1",
- "key2 value2"
], - "packages": [
- "string"
], - "repositories": [
- "string"
], - "jars": [
- "string"
], - "archives": [
- "string"
], - "env_var": [
- "key1=value1",
- "key2=value2"
], - "conf_secret": [
- "key"
], - "env_var_secret": [
- "key"
], - "files": [
- "string"
], - "py-files": [
- "string"
]
}, - "metastore_catalog_id": null,
- "spark_infra_version_id": "0",
- "engine_cluster_spark_config": {
- "max_parallel_spark_job_execution_per_instance": 5,
- "num_of_workers": null
}, - "cluster_type": "YEEDU",
- "min_instances": 1,
- "max_instances": 3,
- "clean_up_timeout": "240",
- "keep_scratch_disk": true,
- "machine_volume_config": {
- "machine_volume_config_id": "41",
- "machine_volume_config_name": "cluster_volume_99f47016-b18a-4f31-8efd-ff02f6a97964",
- "machine_volume_config_description": null,
- "size": "4",
- "encrypted": false,
- "machine_volume_conf_num": 2,
- "machine_volume_conf_strip": 2,
- "number_of_disks": 2,
- "disk_iops": 3000,
- "disk_throughput_MB": 125,
- "disk_type": {
- "disk_type_id": "9",
- "disk_name": "UltraSSD",
- "cloud_provider_id": "2",
- "has_fixed_size": false,
- "min_size": 4,
- "max_size": 65536,
- "has_fixed_iops": false,
- "min_iops": 3000,
- "max_iops": 400000,
- "has_fixed_throughput": false,
- "min_throughput": 125,
- "max_throughput": 10000
}
}, - "tenant_id": "15a095ee-0fc0-47ec-a44a-b6d9c9455a42",
- "created_by_user_id": "9",
- "modified_by_user_id": "9",
- "last_update_date": "2025-06-11T13:27:04.644Z",
- "from_date": "2025-06-11T13:27:04.644Z",
- "to_date": null
}Get cluster instance logs.
Downloads logs of a cluster instance filtered by ID or name and log type.
You can choose one of the following options to control the output:
- Provide last_n_lines to fetch only the last N lines of the log.
- Provide file_size_bytes to download only the first N bytes of the log.
If neither option is specified, the complete log file will be downloaded.
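For example, a preview of the last few hundred lines of stderr could be fetched as sketched below; the path and auth header are placeholders, and the response is treated as plain text.

import requests

BASE_URL = "https://yeedu.example.com/api/v1"            # hypothetical
HEADERS = {"Authorization": "Bearer <your-api-token>"}   # hypothetical

log_type = "stderr"                              # documented enum: stdout | stderr
url = f"{BASE_URL}/cluster/log/{log_type}"       # hypothetical path using the path parameter

# Preview only the last 200 lines; pass file_size_bytes instead to cap by size,
# or omit both to download the complete log file.
params = {"cluster_id": 21, "last_n_lines": 200}

resp = requests.get(url, headers=HEADERS, params=params)
resp.raise_for_status()
print(resp.text)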
Authorizations:
path Parameters
| log_type required | string Enum: "stdout" "stderr" The type of log file to filter. |
query Parameters
| cluster_id | integer <int64> Cluster instance ID used for filtering. |
| cluster_name | string Cluster instance name used for filtering. |
| cluster_status_id | integer <int64> Cluster status ID used for filtering. |
| last_n_lines | integer <int32> [ 1 .. 1000 ] Number of lines to retrieve from the end of the log file (sample preview). |
| file_size_bytes | integer <int64> >= 1 Number of bytes to preview from the beginning of the log file (sample preview). |
Responses
Response samples
- 400
- 401
- 404
- 500
{- "error_code": "string",
- "error_message": "string"
}Get Spark job statistics of a cluster instance.
Retrieve Spark job statistics for a specific cluster instance by its ID or Name.
Authorizations:
query Parameters
| cluster_id | integer <int64> Cluster instance ID used for filtering. |
| cluster_name | string Cluster instance name used for filtering. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "SUBMITTED": 0,
- "RUNNING": 0,
- "DONE": 511607,
- "ERROR": 0,
- "TERMINATED": 0,
- "STOPPING": 0,
- "STOPPED": 0,
- "TOTAL_JOB_COUNT": 511607
}Get workflow errors of a cluster instance.
Retrieve a list of workflow errors for a specific cluster instance by its ID.
Authorizations:
path Parameters
| cluster_id required | integer <int64> Cluster instance ID used for filtering. |
query Parameters
| cluster_status_id | integer <int64> Cluster status ID used for filtering. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "error": "OnError(Process finish with exit code 137,None,None,None,Some(akka.stream.alpakka.amqp.impl.AmqpSourceStage$$anon$1$$anon$2$$anon$3@47b8c55f))"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 1,
- "total_pages": 1,
- "limit": 10
}
}Stop all jobs in a cluster instance.
Stops all jobs in the SUBMITTED or RUNNING state for clusters that are in the INITIATING, RUNNING, or ERROR state.
Authorizations:
path Parameters
| cluster_id required | integer <int64> Specifies the ID of the cluster instance to stop all the jobs. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "CosiKillAllJobsByCluster": {
- "workflow_job_id": 0,
- "workflow_job_instance_id": 0,
- "engine_cluster_instance_id": 1,
- "created_by_user_id": 1
}
}Get all filter data for cluster instances.
Retrieves one of the following, depending on the selected filter type (a request sketch follows this list):
- Spark infra versions used by cluster instances
- Active memory and cores used by the cluster configurations attached to cluster instances
- Users who created the cluster instances
- Users who modified the cluster instances
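A request sketch is shown below; the path and auth header are hypothetical, and filter_type takes one of the enum values documented in the parameter table.

import requests

BASE_URL = "https://yeedu.example.com/api/v1"            # hypothetical
HEADERS = {"Authorization": "Bearer <your-api-token>"}   # hypothetical

params = {
    "filter_type": "created_by_user",   # one of the documented enum values
    "enable": True,                     # optional: restrict to active clusters
    "name": "ysu",                      # optional name filter for the chosen type
}
body = requests.get(f"{BASE_URL}/clusters/filter-data",  # hypothetical path
                    headers=HEADERS, params=params).json()
print(body["data"])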
Authorizations:
query Parameters
| enable | boolean Enum: true false Specifies which clusters to list.
Note: If unspecified, all clusters (both active and disabled) will be listed. |
| filter_type required | string Enum: "runtime_version" "active_memory_and_cores" "created_by_user" "modified_by_user" Specifies the filter type for the data to retrieve. Choose one of the following:
|
| name | string Specifies the name to filter by, applicable to the chosen filter type. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "user_id": 1,
- "username": "rp0000@yeedu.io"
}
]
}Get Workspaces for a specific Cluster Id
Retrieve a list of workspaces filtered by a specific Cluster Id.
Authorizations:
path Parameters
| cluster_id required | integer <int64> Cluster ID used for filtering. |
query Parameters
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "workspace": {
- "workspace_id": 89,
- "name": "namename_2",
- "description": "description"
}, - "cluster_info": {
- "cluster_id": 32,
- "name": "ct2508-test-cluster",
- "cluster_status": "DESTROYED",
- "cluster_type": "YEEDU",
- "instance_size": 0,
- "min_instances": 1,
- "max_instances": 1,
- "is_turbo_enabled": true,
- "is_cuda_enabled": false,
- "cluster_conf": {
- "cluster_conf_id": 55,
- "cluster_conf_name": "c5ad.xlarge",
- "machine_type_category": "compute_optimized",
- "machine_type": {
- "machine_type_id": 55,
- "name": "c5ad.xlarge",
- "vCPUs": 4,
- "memory": "8 GiB",
- "has_cuda": false,
- "gpu_model": null,
- "gpus": 0,
- "gpu_memory": null,
- "cpu_model": [
- "AMD EPYC 7R32"
], - "cpu_min_frequency_GHz": [
- "2.8"
], - "cpu_max_frequency_GHz": [
- "3.3"
], - "has_local_disk": true,
- "local_disk_size_GB": 150,
- "local_num_of_disks": 1,
- "local_disk_throughput_MB": null,
- "has_spot_instance_support": true,
- "machine_price_ycu": 1.8
}, - "machine_volume_conf": null
}, - "cloud_env": {
- "cloud_env_id": 2,
- "name": "aws_env",
- "cloud_provider": {
- "cloud_provider_id": 1,
- "name": "AWS"
}
}, - "spark_infra_version": {
- "spark_infra_version_id": 5,
- "spark_docker_image_name": "v3.5.3-6",
- "spark_version": "3.5.3",
- "hive_version": "2.3.9",
- "hadoop_version": "3.2.4",
- "scala_version": "2.12.15",
- "python_version": "3.9.5",
- "notebook_support": true,
- "has_cuda_support": false,
- "thrift_support": true,
- "yeedu_functions_support": true,
- "has_turbo_support": true,
- "turbo_version": "v1.0.7",
- "has_unity_support": true,
- "unity_version": "v1.0.7",
- "has_hive_support": true,
- "cuda_rapids_version": "23.04.1"
}, - "engine_cluster_spark_config": {
- "max_parallel_spark_job_execution_per_instance": 4,
- "num_of_workers": null
}, - "metastore_catalog": {
- "metastore_catalog_id": 4,
- "metastore_catalog_name": "aws_unity_catalog",
- "description": null,
- "metastore_catalog_type": {
- "metastore_catalog_type_id": 2,
- "name": "DATABRICKS UNITY",
- "description": null
}
}
}, - "thrift_url": "jdbc:hive2://dev-onprem-004.yeedu.io:8080/default;ssl=true;transportMode=http;httpPath=/api/v1/workspace/89/cluster/32/thrift;reuseSameConn=true;ignoreHttpVerify=true;maxJobStartTimeout=5;",
- "created_by": {
- "user_id": 1,
- "username": "YSU0000"
}, - "modified_by": {
- "user_id": 1,
- "username": "YSU0000"
}, - "last_update_date": "2025-08-20T08:51:12.776506+00:00",
- "from_date": "2025-08-20T08:51:12.776506+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 3,
- "total_pages": 3,
- "limit": 1,
- "next_page": 2
}
}Search Workspaces for a specific Cluster Id and Workspace Name
Searches workspaces for a specific Cluster Id and Workspace Name; the results are returned as JSON.
Authorizations:
path Parameters
| cluster_id required | integer <int64> Cluster Id that will be used for filter |
query Parameters
| workspace_name required | string Workspace Name that will be used for filter |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "workspace": {
- "workspace_id": 89,
- "name": "namename_2",
- "description": "description"
}, - "cluster_info": {
- "cluster_id": 32,
- "name": "ct2508-test-cluster",
- "cluster_status": "DESTROYED",
- "cluster_type": "YEEDU",
- "instance_size": 0,
- "min_instances": 1,
- "max_instances": 1,
- "is_turbo_enabled": true,
- "is_cuda_enabled": false,
- "cluster_conf": {
- "cluster_conf_id": 55,
- "cluster_conf_name": "c5ad.xlarge",
- "machine_type_category": "compute_optimized",
- "machine_type": {
- "machine_type_id": 55,
- "name": "c5ad.xlarge",
- "vCPUs": 4,
- "memory": "8 GiB",
- "has_cuda": false,
- "gpu_model": null,
- "gpus": 0,
- "gpu_memory": null,
- "cpu_model": [
- "AMD EPYC 7R32"
], - "cpu_min_frequency_GHz": [
- "2.8"
], - "cpu_max_frequency_GHz": [
- "3.3"
], - "has_local_disk": true,
- "local_disk_size_GB": 150,
- "local_num_of_disks": 1,
- "local_disk_throughput_MB": null,
- "has_spot_instance_support": true,
- "machine_price_ycu": 1.8
}, - "machine_volume_conf": null
}, - "cloud_env": {
- "cloud_env_id": 2,
- "name": "aws_env",
- "cloud_provider": {
- "cloud_provider_id": 1,
- "name": "AWS"
}
}, - "spark_infra_version": {
- "spark_infra_version_id": 5,
- "spark_docker_image_name": "v3.5.3-6",
- "spark_version": "3.5.3",
- "hive_version": "2.3.9",
- "hadoop_version": "3.2.4",
- "scala_version": "2.12.15",
- "python_version": "3.9.5",
- "notebook_support": true,
- "has_cuda_support": false,
- "thrift_support": true,
- "yeedu_functions_support": true,
- "has_turbo_support": true,
- "turbo_version": "v1.0.7",
- "has_unity_support": true,
- "unity_version": "v1.0.7",
- "has_hive_support": true,
- "cuda_rapids_version": "23.04.1"
}, - "engine_cluster_spark_config": {
- "max_parallel_spark_job_execution_per_instance": 4,
- "num_of_workers": null
}, - "metastore_catalog": {
- "metastore_catalog_id": 4,
- "metastore_catalog_name": "aws_unity_catalog",
- "description": null,
- "metastore_catalog_type": {
- "metastore_catalog_type_id": 2,
- "name": "DATABRICKS UNITY",
- "description": null
}
}
}, - "thrift_url": "jdbc:hive2://dev-onprem-004.yeedu.io:8080/default;ssl=true;transportMode=http;httpPath=/api/v1/workspace/89/cluster/32/thrift;reuseSameConn=true;ignoreHttpVerify=true;maxJobStartTimeout=5;",
- "created_by": {
- "user_id": 1,
- "username": "YSU0000"
}, - "modified_by": {
- "user_id": 1,
- "username": "YSU0000"
}, - "last_update_date": "2025-08-20T08:51:12.776506+00:00",
- "from_date": "2025-08-20T08:51:12.776506+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 3,
- "total_pages": 3,
- "limit": 1,
- "next_page": 2
}
}Get Clusters for a specific Workspace Id
Retrieves clusters mapped to a workspace, with optional filters by job type, cluster status, and enablement.
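For instance, listing only running clusters that support notebooks in a workspace might look like the sketch below; the path and auth header are placeholders.

import requests

BASE_URL = "https://yeedu.example.com/api/v1"            # hypothetical
HEADERS = {"Authorization": "Bearer <your-api-token>"}   # hypothetical

workspace_id = 89
params = {"job_type": "NOTEBOOK", "cluster_status": ["RUNNING"], "limit": 100, "pageNumber": 1}
body = requests.get(f"{BASE_URL}/workspace/{workspace_id}/clusters",  # hypothetical path
                    headers=HEADERS, params=params).json()
for item in body["data"]:
    print(item["cluster_info"]["name"], item["cluster_info"]["cluster_status"])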
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| cluster_status | Array of strings Items Enum: "INITIATING" "RUNNING" "STOPPING" "STOPPED" "DESTROYING" "DESTROYED" "ERROR" "RESIZING_UP" "RESIZING_DOWN" Specifies the cluster instance statuses to be used as a filter. |
| job_type | string Enum: "SPARK_JOB" "SPARK_SQL" "NOTEBOOK" "THRIFT_SQL" "YEEDU_FUNCTIONS" Specifies the job type to filter and list the supported clusters. |
| enable | boolean Enum: true false Specifies which clusters to list.
Note: If unspecified, all clusters (both active and disabled) will be listed. |
| all | boolean Default: false Enum: true false A boolean that can be set to return all the data. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "workspace": {
- "workspace_id": 89,
- "name": "namename_2",
- "description": "description"
}, - "cluster_info": {
- "cluster_id": 32,
- "name": "ct2508-test-cluster",
- "cluster_status": "DESTROYED",
- "cluster_type": "YEEDU",
- "instance_size": 0,
- "min_instances": 1,
- "max_instances": 1,
- "is_turbo_enabled": true,
- "is_cuda_enabled": false,
- "cluster_conf": {
- "cluster_conf_id": 55,
- "cluster_conf_name": "c5ad.xlarge",
- "machine_type_category": "compute_optimized",
- "machine_type": {
- "machine_type_id": 55,
- "name": "c5ad.xlarge",
- "vCPUs": 4,
- "memory": "8 GiB",
- "has_cuda": false,
- "gpu_model": null,
- "gpus": 0,
- "gpu_memory": null,
- "cpu_model": [
- "AMD EPYC 7R32"
], - "cpu_min_frequency_GHz": [
- "2.8"
], - "cpu_max_frequency_GHz": [
- "3.3"
], - "has_local_disk": true,
- "local_disk_size_GB": 150,
- "local_num_of_disks": 1,
- "local_disk_throughput_MB": null,
- "has_spot_instance_support": true,
- "machine_price_ycu": 1.8
}, - "machine_volume_conf": null
}, - "cloud_env": {
- "cloud_env_id": 2,
- "name": "aws_env",
- "cloud_provider": {
- "cloud_provider_id": 1,
- "name": "AWS"
}
}, - "spark_infra_version": {
- "spark_infra_version_id": 5,
- "spark_docker_image_name": "v3.5.3-6",
- "spark_version": "3.5.3",
- "hive_version": "2.3.9",
- "hadoop_version": "3.2.4",
- "scala_version": "2.12.15",
- "python_version": "3.9.5",
- "notebook_support": true,
- "has_cuda_support": false,
- "thrift_support": true,
- "yeedu_functions_support": true,
- "has_turbo_support": true,
- "turbo_version": "v1.0.7",
- "has_unity_support": true,
- "unity_version": "v1.0.7",
- "has_hive_support": true,
- "cuda_rapids_version": "23.04.1"
}, - "engine_cluster_spark_config": {
- "max_parallel_spark_job_execution_per_instance": 4,
- "num_of_workers": null
}, - "metastore_catalog": {
- "metastore_catalog_id": 4,
- "metastore_catalog_name": "aws_unity_catalog",
- "description": null,
- "metastore_catalog_type": {
- "metastore_catalog_type_id": 2,
- "name": "DATABRICKS UNITY",
- "description": null
}
}
}, - "thrift_url": "jdbc:hive2://dev-onprem-004.yeedu.io:8080/default;ssl=true;transportMode=http;httpPath=/api/v1/workspace/89/cluster/32/thrift;reuseSameConn=true;ignoreHttpVerify=true;maxJobStartTimeout=5;",
- "created_by": {
- "user_id": 1,
- "username": "YSU0000"
}, - "modified_by": {
- "user_id": 1,
- "username": "YSU0000"
}, - "last_update_date": "2025-08-20T08:51:12.776506+00:00",
- "from_date": "2025-08-20T08:51:12.776506+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 3,
- "total_pages": 3,
- "limit": 1,
- "next_page": 2
}
}Search clusters by cluster name in a specific workspace
Clusters are filtered by a specific cluster name within a Workspace ID, and the results are returned as JSON.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Workspace Id that will be used for filter |
query Parameters
| cluster_name required | string Cluster Name that will be used for filter |
| enable | boolean Enum: true false Specifies which clusters to list.
Note: If unspecified, all clusters (both active and disabled) will be listed. |
| cluster_status | Array of strings Items Enum: "INITIATING" "RUNNING" "STOPPING" "STOPPED" "DESTROYING" "DESTROYED" "ERROR" "RESIZING_UP" "RESIZING_DOWN" Specifies the cluster instance statuses to be used as a filter. |
| job_type | string Enum: "SPARK_JOB" "SPARK_SQL" "NOTEBOOK" "THRIFT_SQL" "YEEDU_FUNCTIONS" Specifies the job type to filter and search the supported clusters. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "workspace": {
- "workspace_id": 89,
- "name": "namename_2",
- "description": "description"
}, - "cluster_info": {
- "cluster_id": 32,
- "name": "ct2508-test-cluster",
- "cluster_status": "DESTROYED",
- "cluster_type": "YEEDU",
- "instance_size": 0,
- "min_instances": 1,
- "max_instances": 1,
- "is_turbo_enabled": true,
- "is_cuda_enabled": false,
- "cluster_conf": {
- "cluster_conf_id": 55,
- "cluster_conf_name": "c5ad.xlarge",
- "machine_type_category": "compute_optimized",
- "machine_type": {
- "machine_type_id": 55,
- "name": "c5ad.xlarge",
- "vCPUs": 4,
- "memory": "8 GiB",
- "has_cuda": false,
- "gpu_model": null,
- "gpus": 0,
- "gpu_memory": null,
- "cpu_model": [
- "AMD EPYC 7R32"
], - "cpu_min_frequency_GHz": [
- "2.8"
], - "cpu_max_frequency_GHz": [
- "3.3"
], - "has_local_disk": true,
- "local_disk_size_GB": 150,
- "local_num_of_disks": 1,
- "local_disk_throughput_MB": null,
- "has_spot_instance_support": true,
- "machine_price_ycu": 1.8
}, - "machine_volume_conf": null
}, - "cloud_env": {
- "cloud_env_id": 2,
- "name": "aws_env",
- "cloud_provider": {
- "cloud_provider_id": 1,
- "name": "AWS"
}
}, - "spark_infra_version": {
- "spark_infra_version_id": 5,
- "spark_docker_image_name": "v3.5.3-6",
- "spark_version": "3.5.3",
- "hive_version": "2.3.9",
- "hadoop_version": "3.2.4",
- "scala_version": "2.12.15",
- "python_version": "3.9.5",
- "notebook_support": true,
- "has_cuda_support": false,
- "thrift_support": true,
- "yeedu_functions_support": true,
- "has_turbo_support": true,
- "turbo_version": "v1.0.7",
- "has_unity_support": true,
- "unity_version": "v1.0.7",
- "has_hive_support": true,
- "cuda_rapids_version": "23.04.1"
}, - "engine_cluster_spark_config": {
- "max_parallel_spark_job_execution_per_instance": 4,
- "num_of_workers": null
}, - "metastore_catalog": {
- "metastore_catalog_id": 4,
- "metastore_catalog_name": "aws_unity_catalog",
- "description": null,
- "metastore_catalog_type": {
- "metastore_catalog_type_id": 2,
- "name": "DATABRICKS UNITY",
- "description": null
}
}
}, - "thrift_url": "jdbc:hive2://dev-onprem-004.yeedu.io:8080/default;ssl=true;transportMode=http;httpPath=/api/v1/workspace/89/cluster/32/thrift;reuseSameConn=true;ignoreHttpVerify=true;maxJobStartTimeout=5;",
- "created_by": {
- "user_id": 1,
- "username": "YSU0000"
}, - "modified_by": {
- "user_id": 1,
- "username": "YSU0000"
}, - "last_update_date": "2025-08-20T08:51:12.776506+00:00",
- "from_date": "2025-08-20T08:51:12.776506+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 3,
- "total_pages": 3,
- "limit": 1,
- "next_page": 2
}
}Map a Cluster with a Workspace
Creates a mapping between a cluster and a workspace.
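A minimal sketch of creating the mapping, assuming a base URL, bearer token, and a hypothetical route built from the two documented path parameters.
import requests

BASE_URL = "https://yeedu.example.com/api/v1"   # assumed
HEADERS = {"Authorization": "Bearer <token>"}    # assumed
workspace_id, cluster_id = 5, 13

# Hypothetical route; both IDs are path parameters per the documentation above.
url = f"{BASE_URL}/workspace/{workspace_id}/cluster/{cluster_id}"
resp = requests.post(url, headers=HEADERS)
resp.raise_for_status()
print(resp.json())  # e.g. {"workspace_id": "5", "cluster_id": "13", ...}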
Authorizations:
path Parameters
| workspace_id required | integer <int64> Workspace Id that will be used for filter |
| cluster_id required | integer <int64> Cluster Id that will be used for filter |
Responses
Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "workspace_id": "5",
- "cluster_id": "13",
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-06-21T11:20:59.259Z",
- "from_date": "2024-06-21T11:20:59.259Z",
- "to_date": null
}Unmap a Cluster from a Workspace
Deletes the mapping between a cluster and a workspace.
Authorizations:
path Parameters
| cluster_id required | integer <int64> Cluster Id that will be used for filter |
| workspace_id required | integer <int64> Workspace Id that will be used for filter |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 500
{- "message": "Deleted Permission Successfully."
}Retrieve all accessible workspaces for the user.
All workspaces within the tenant will be listed for users with either a Platform Admin or Admin role. For users with a Can Manage Cluster or User role, only the workspaces where they have at least one workspace permission will be listed.
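A minimal paging sketch for this listing endpoint, assuming a base URL, bearer token, and a hypothetical route; the enable, limit, and pageNumber parameters and the result_set fields come from the documentation and response sample below.
import requests

BASE_URL = "https://yeedu.example.com/api/v1"   # assumed
HEADERS = {"Authorization": "Bearer <token>"}    # assumed

page, workspaces = 1, []
while True:
    # Hypothetical route for listing accessible workspaces.
    resp = requests.get(f"{BASE_URL}/workspaces",
                        headers=HEADERS,
                        params={"enable": "true", "limit": 100, "pageNumber": page})
    resp.raise_for_status()
    body = resp.json()
    workspaces.extend(body["data"])
    if page >= body["result_set"]["total_pages"]:
        break
    page += 1
print(f"fetched {len(workspaces)} workspaces")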
Authorizations:
query Parameters
| enable | boolean Enum: true false Specifies which workspace to list.
Note: If unspecified, all workspaces (both active and disabled) will be listed. |
| cluster_id | integer <int64> Filter workspaces by cluster ID. |
| is_attached | boolean Enum: true false Filter workspaces by cluster attachment status. Requires cluster_id to also be provided. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "workspace_id": 5,
- "name": "test",
- "job_count": 4,
- "notebook_count": 2,
- "created_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "tenant_id": "49a8817e-8b8e-4d76-a717-c33db6f7e018",
- "last_update_date": "2024-06-20T17:20:46.149683+00:00",
- "from_date": "2024-06-20T17:20:46.149683+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 1,
- "total_pages": 1,
- "limit": 1
}
}Search workspaces based on workspace name.
All workspaces within the tenant can be searched by users with either a Platform Admin or Admin role. For users with a Can Manage Cluster or User role, only the workspaces where they have at least one workspace permission can be searched.
Authorizations:
query Parameters
| workspace_name required | string Specifies the name of the workspace to search. |
| enable | boolean Enum: true false Specifies which workspace to search.
Note: If unspecified, all workspaces (both active and disabled) will be searched. |
| cluster_id | integer <int64> Filter workspaces by cluster ID. |
| is_attached | boolean Enum: true false Filter workspaces by cluster attachment status. Requires cluster_id to also be provided. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "workspace_id": 5,
- "name": "test",
- "job_count": 4,
- "notebook_count": 2,
- "created_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "tenant_id": "49a8817e-8b8e-4d76-a717-c33db6f7e018",
- "last_update_date": "2024-06-20T17:20:46.149683+00:00",
- "from_date": "2024-06-20T17:20:46.149683+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 1,
- "total_pages": 1,
- "limit": 1
}
}Create a new Workspace
This API creates a new workspace in the tenant using the workspace details provided. The workspace will be available for users once created.
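A minimal creation sketch; the request body mirrors the documented schema and the Payload sample below, while the base URL, bearer token, and route are assumptions.
import requests

BASE_URL = "https://yeedu.example.com/api/v1"   # assumed
HEADERS = {"Authorization": "Bearer <token>"}    # assumed

payload = {"name": "test_workspace", "description": "Workspace for test environment"}
# Hypothetical route for workspace creation.
resp = requests.post(f"{BASE_URL}/workspace", headers=HEADERS, json=payload)
resp.raise_for_status()
print(resp.json()["workspace_id"])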
Authorizations:
Request Body schema: application/jsonrequired
The Workspace details to be added
| name required | string non-empty The name of the workspace. |
| description | string or null non-empty |
Responses
Request samples
- Payload
{- "name": "test_workspace",
- "description": "Workspace for test environment"
}Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "workspace_id": "6",
- "name": "spark_jobs_test",
- "description": "Test Spark Jobs",
- "tenant_id": "49a8817e-8b8e-4d76-a717-c33db6f7e018",
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-06-21T11:29:36.685Z",
- "from_date": "2024-06-21T11:29:36.685Z",
- "to_date": null
}Get details of a specific workspace.
Workspace details within the tenant can be retrieved by users with either a Platform Admin or Admin role. For users with a Can Manage Cluster or User role, only the workspace details where they have at least one workspace permission can be retrieved.
Authorizations:
query Parameters
| workspace_id | integer <int64> Specifies the ID of the workspace to retrieve details. |
| workspace_name | string Specifies the name of the workspace to retrieve details. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "workspace_id": 6,
- "name": "spark_jobs_test",
- "description": "Test Spark Jobs",
- "created_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "tenant_id": "49a8817e-8b8e-4d76-a717-c33db6f7e018",
- "last_update_date": "2024-06-21T11:29:36.685681+00:00",
- "from_date": "2024-06-21T11:29:36.685681+00:00",
- "to_date": "infinity"
}Update details of a specific workspace.
All workspaces within the tenant can be modified by users with either a Platform Admin or Admin role. For users with a Can Manage Cluster or User role, only the workspaces where they have at least one workspace permission can be modified.
Authorizations:
query Parameters
| workspace_id | integer <int64> Specifies the ID of the workspace to update details. |
| workspace_name | string Specifies the name of the workspace to update details. |
Request Body schema: application/jsonrequired
The Workspace details to be modified
| name | string non-empty The name of the workspace. |
| description | string or null non-empty |
Responses
Request samples
- Payload
{- "name": "spark_jobs_test",
- "description": "Test Spark Curation Jobs"
}Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "workspace_id": "6",
- "name": "spark_jobs_test",
- "description": "Test Spark Jobs",
- "tenant_id": "49a8817e-8b8e-4d76-a717-c33db6f7e018",
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-06-21T11:29:36.685Z",
- "from_date": "2024-06-21T11:29:36.685Z",
- "to_date": null
}Enable a specific workspace.
All workspaces within the tenant can be enabled by users with either a Platform Admin or Admin role. For users with a Can Manage Cluster or User role, only the workspaces where they have at least one workspace permission can be enabled.
Authorizations:
query Parameters
| workspace_id | integer <int64> Workspace Id that will be used for filter |
| workspace_name | string Workspace Name that will be used for filter |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "workspace_id": "6",
- "name": "spark_jobs_test",
- "description": "Test Spark Jobs",
- "tenant_id": "49a8817e-8b8e-4d76-a717-c33db6f7e018",
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-06-21T11:29:36.685Z",
- "from_date": "2024-06-21T11:29:36.685Z",
- "to_date": null
}Disable a specific workspace.
All workspaces within the tenant can be disabled by users with either a Platform Admin or Admin role. For users with a Can Manage Cluster or User role, only the workspaces where they have at least one workspace permission can be disabled.
Authorizations:
query Parameters
| workspace_id | integer <int64> Workspace Id that will be used for filter |
| workspace_name | string Workspace Name that will be used for filter |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "workspace_id": "6",
- "name": "spark_jobs_test",
- "description": "Test Spark Curation Jobs",
- "tenant_id": "49a8817e-8b8e-4d76-a717-c33db6f7e018",
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-06-21T11:35:38.174Z",
- "from_date": "2024-06-21T11:29:36.685Z",
- "to_date": "2024-06-21T11:35:38.174Z"
}Get Spark job statistics of a workspace.
Retrieve Spark job statistics for a specific workspace by its ID or Name.
Workspace statistics within the tenant can be retrieved by users with either a Platform Admin or Admin role. For users with a Can Manage Cluster or User role, only the workspace statistics where they have at least one workspace permission can be retrieved.
Authorizations:
query Parameters
| workspace_id | integer <int64> Specifies the ID of the workspace to retrieve stats. |
| workspace_name | string Specifies the name of the workspace to retrieve stats. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "SUBMITTED": 0,
- "RUNNING": 0,
- "DONE": 511607,
- "ERROR": 0,
- "TERMINATED": 0,
- "STOPPING": 0,
- "STOPPED": 0,
- "TOTAL_JOB_COUNT": 511607
}Export job and notebooks of a specified workspace.
Export job and notebooks of a specified workspace.
- The user must have at least one permission in the workspace from which the jobs or notebooks are being exported.
Note: If the workspace contains notebooks, the export is a ZIP file containing the notebook files along with the job and notebook configurations; otherwise, for jobs only, a single file is exported.
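A minimal download sketch for the export. The base URL, bearer token, route, and the Content-Type check used to pick a file extension are assumptions; the workspace_id and enable query parameters are documented above.
import requests

BASE_URL = "https://yeedu.example.com/api/v1"   # assumed
HEADERS = {"Authorization": "Bearer <token>"}    # assumed

# Hypothetical route; filter by workspace_id or workspace_name as documented.
resp = requests.get(f"{BASE_URL}/workspace/export",
                    headers=HEADERS,
                    params={"workspace_id": 5, "enable": "true"},
                    stream=True)
resp.raise_for_status()
# If the workspace contains notebooks the export is a ZIP, otherwise a single file
# (the Content-Type sniffing below is an assumption, not documented behavior).
suffix = ".zip" if "zip" in resp.headers.get("Content-Type", "") else ".yeedu"
with open(f"workspace_5_export{suffix}", "wb") as fh:
    for chunk in resp.iter_content(chunk_size=8192):
        fh.write(chunk)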
Authorizations:
query Parameters
| workspace_id | integer <int64> Specifies the ID of the workspace to export job and notebooks from. |
| workspace_name | string Specifies the name of the workspace to export job and notebooks from. |
| enable | boolean Enum: true false Specifies which job and notebooks to export.
Note: If unspecified, all configurations (both active and disabled) will be exported. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "jobs": [
- {
- "name": "spark_examples",
- "spark_job_type": {
- "job_type": "SPARK_JOB",
- "language": "Jar"
}, - "cluster_info": null,
- "max_concurrency": 0,
- "files": null,
- "properties_file": null,
- "conf": null,
- "packages": null,
- "repositories": null,
- "jars": null,
- "archives": null,
- "driver_memory": null,
- "driver_java_options": "-Dderby.system.home=/yeedu/spark_metastores/1721804846-27730",
- "driver_library_path": null,
- "driver_class_path": null,
- "executor_memory": null,
- "principal": null,
- "keytab": null,
- "queue": null,
- "job_class_name": "org.apache.spark.examples.SparkPi",
- "job_command": "file:///yeedu/object-storage-manager/spark-examples_2.11-2.4.8.jar",
- "job_arguments": "250",
- "job_rawScalaCode": null,
- "job_timeout_min": null,
- "driver_cores": null,
- "total_executor_cores": null,
- "executor_cores": null,
- "num_executors": null,
- "should_append_params": false
}
], - "notebooks": [
- {
- "name": "test_notebook",
- "spark_job_type": {
- "job_type": "NOTEBOOK",
- "language": "Python3"
}, - "cluster_info": null,
- "notebook_file_id": "1",
- "conf": null,
- "packages": null,
- "jars": null,
- "files": null,
- "driver_memory": null,
- "executor_memory": null,
- "driver_cores": null,
- "total_executor_cores": null,
- "executor_cores": null,
- "num_executors": null,
- "should_append_params": false
}
]
}Import job and notebooks into a specified workspace.
Import job and notebooks into a specified workspace.
The user must have either the MANAGE or EDIT permission in the workspace where the job or notebook will be imported. The cluster specified in the parameter takes precedence over the cluster specified in the import JSON, if present.
The Spark parameters (driver_memory, driver_cores, executor_memory, executor_cores, num_executors, total_executor_cores) are ignored if the cluster types do not match:
- Between the cluster provided in the query parameter and the cluster specified in the import JSON, provided both clusters are present and not destroyed.
- If the cluster specified in the import JSON has a different type from the cluster with the same name, or if the cluster has been destroyed.
The cluster is not attached to the job or notebook if:
- The cluster provided in the query parameter is not found or has been destroyed.
- No cluster is provided in the query parameter, and:
  - The cluster specified in the import JSON is not found or has been destroyed.
  - The cluster information in the import JSON is null.
The workspace provided must have access to use the cluster found according to the above rules.
The job or notebook configuration is overwritten if the overwrite parameter is set to true and an existing job or notebook with the same name is found.
By default, max_concurrency for a notebook will always be 1.
The behavior of the import process depends on the permissive parameter:
- If set to true: any error encountered while importing either a job or a notebook is ignored, and the remaining jobs or notebooks are imported successfully despite the error.
- If set to false (default): any error encountered while importing either a job or a notebook results in a rollback, and none of the jobs or notebooks are imported.
To import exported notebooks, follow these steps:
Extract the ZIP file:
- Unzip the exported folder.
Upload the Notebooks Folder (if applicable):
- If the unzipped folder contains a notebooks directory, upload it to the target workspace where the notebooks should be imported.
Import Using the .yeedu File:
- Utilize the JSON content inside the name.yeedu file to import the notebook(s).
Handling Non-Zipped Exports (a warning message will be given in the API response in this case):
- If the exported notebook(s) were not provided in a ZIP file, directly use the JSON inside the name.yeedu file for import. No file or folder needs to be uploaded to the workspace.
- In this case, no actual notebook files are associated with the imported notebooks; only their configurations are imported.
- When the notebook is opened via the UI, a new notebook file will be automatically created based on the imported configuration.
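A minimal import sketch: it loads the JSON from an exported name.yeedu file (path is illustrative) and posts it with the documented query parameters. The base URL, bearer token, and route are assumptions.
import json
import requests

BASE_URL = "https://yeedu.example.com/api/v1"   # assumed
HEADERS = {"Authorization": "Bearer <token>"}    # assumed

# Load the JSON content of the exported name.yeedu file (illustrative path).
with open("export/name.yeedu") as fh:
    payload = json.load(fh)

# Hypothetical route; cluster_name, permissive, and overwrite follow the
# documented query parameters above.
resp = requests.post(f"{BASE_URL}/workspace/import",
                     headers=HEADERS,
                     params={"workspace_id": 5, "cluster_name": "gcp_test",
                             "permissive": "false", "overwrite": "true"},
                     json=payload)
resp.raise_for_status()
print(resp.json()["message"])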
Authorizations:
query Parameters
| workspace_id | integer <int64> Specifies the ID of the workspace for importing job and notebooks. |
| workspace_name | string Specifies the name of the workspace for importing job and notebooks. |
| cluster_id | integer <int64> Specifies the cluster ID to which all imported job and notebooks will be attached. |
| cluster_name | string Specifies the cluster name to which all imported job and notebooks will be attached. |
| permissive | boolean Default: false Enum: true false Specifies if the import should be permissive, allowing for partial imports when encountering errors. |
| overwrite | boolean Default: false Enum: true false Specifies whether to overwrite existing configurations with the same job or notebook name. |
Request Body schema: application/jsonrequired
Array of objects or null (ImportSparkJobConfig) | |
Array of objects or null (ImportNotebookConfig) |
Responses
Request samples
- Payload
{- "jobs": [
- {
- "name": "string",
- "spark_job_type": {
- "job_type": "SPARK_JOB",
- "language": "RAW_SCALA"
}, - "cluster_info": {
- "cluster_name": "string",
- "cluster_type": "YEEDU"
}, - "max_concurrency": 0,
- "files": [
- "string"
], - "properties_file": [
- "string"
], - "conf": [
- "string"
], - "packages": [
- "string"
], - "repositories": [
- "string"
], - "jars": [
- "string"
], - "archives": [
- "string"
], - "driver_memory": "string",
- "driver_java_options": "string",
- "driver_library_path": "string",
- "driver_class_path": "string",
- "executor_memory": "string",
- "principal": "string",
- "keytab": "string",
- "queue": "string",
- "job_class_name": "string",
- "job_command": "string",
- "job_arguments": "string",
- "job_rawScalaCode": "string",
- "job_timeout_min": 1,
- "extra_info": { },
- "driver_cores": 1,
- "total_executor_cores": 1,
- "executor_cores": 1,
- "num_executors": 1,
- "should_append_params": false,
- "yeedu_functions_project_path": "string",
- "yeedu_functions_script_path": "string",
- "yeedu_functions_function_name": "string",
- "yeedu_functions_requirements": "string",
- "yeedu_functions_max_request_concurrency": 1,
- "yeedu_functions_idle_timeout_sec": 300,
- "yeedu_functions_request_timeout_sec": 1,
- "yeedu_functions_example_request_body": "string"
}
], - "notebooks": [
- {
- "name": "string",
- "spark_job_type": {
- "job_type": "NOTEBOOK",
- "language": "Python3"
}, - "cluster_info": {
- "cluster_name": "string",
- "cluster_type": "YEEDU"
}, - "notebook_file_path": "string",
- "conf": [
- "string"
], - "packages": [
- "string"
], - "jars": [
- "string"
], - "files": [
- "string"
], - "driver_memory": "string",
- "executor_memory": "string",
- "driver_cores": 1,
- "total_executor_cores": 1,
- "executor_cores": 1,
- "num_executors": 1,
- "should_append_params": false
}
]
}Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "message": "string",
- "job_details": {
- "successful_imports": [
- "string"
], - "failed_imports": [
- "string"
]
}, - "notebook_details": {
- "successful_imports": [
- "string"
], - "failed_imports": [
- "string"
]
}
}Get all the workspace files.
This API returns a list of files and directories within a specific workspace, identified by workspace ID or name. You can filter by file or directory, by file path or ID, and choose whether to list contents recursively. Pagination parameters control how many results are returned per request.
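A minimal listing sketch, assuming a base URL, bearer token, and a hypothetical route; the filter parameters and the fields read from the response follow the documentation and sample below.
import requests

BASE_URL = "https://yeedu.example.com/api/v1"   # assumed
HEADERS = {"Authorization": "Bearer <token>"}    # assumed

# Hypothetical route; list only directories under a path, recursively.
resp = requests.get(f"{BASE_URL}/workspace/files",
                    headers=HEADERS,
                    params={"workspace_name": "test", "file_path": "/files/parent",
                            "is_dir": "true", "recursive": "true", "limit": 100})
resp.raise_for_status()
for entry in resp.json()["data"]:
    print(entry["full_file_path"], "(dir)" if entry["is_dir"] else entry["file_size_bytes"])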
Authorizations:
query Parameters
| workspace_id | integer <int64> workspace ID used for filtering. |
| workspace_name | string workspace name used for filtering. |
| file_id | integer <int64> workspace file ID used for filtering. |
| file_path | string workspace file path used for filtering. |
| is_dir | boolean or null Enum: true false is_dir parameter used for filtering directories or files. |
| recursive | boolean Default: false Enum: true false Boolean flag to indicate whether to list files recursively. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "workspace_file_id": 9,
- "workspace": {
- "workspace_id": 1,
- "workspace_name": "test",
- "description": null
}, - "file_name": "file5.txt",
- "full_file_path": "file:///files/file5.txt",
- "file_size_bytes": "0",
- "file_type": "txt",
- "is_dir": false,
- "parent_id": null,
- "tenant_id": "eb535243-3747-40ae-b75b-bbaef9bc944b",
- "created_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2025-01-02T09:37:00.82327+00:00",
- "from_date": "2025-01-02T09:37:00.82327+00:00",
- "to_date": "infinity"
}, - {
- "workspace_file_id": 1,
- "workspace": {
- "workspace_id": 1,
- "workspace_name": "test",
- "description": null
}, - "file_name": "parent",
- "full_file_path": "file:///files/parent",
- "file_size_bytes": null,
- "file_type": null,
- "is_dir": true,
- "parent_id": null,
- "tenant_id": "eb535243-3747-40ae-b75b-bbaef9bc944b",
- "created_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2025-01-02T07:49:13.983084+00:00",
- "from_date": "2025-01-02T07:49:13.983084+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 2,
- "total_pages": 1,
- "limit": 100
}
}Upload file to a workspace.
Upload a file to a specified Workspace ID or name.
- overwrite: A boolean parameter indicating whether the file should overwrite any existing file in the workspace. The default value is false.
- is_dir: A boolean parameter indicating whether to create a directory or upload a file.
- path: The path used to specify the file location when uploading a directory. It indicates the directory structure where the file will be stored.
- target_dir: The target directory, used as a prefix, for the destination when uploading a directory or file to the workspace.
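A minimal upload sketch. The base URL, bearer token, route, and the choice to stream the body as raw bytes are assumptions (this extract does not pin down the upload encoding); the x-file-size header and the path, overwrite, and workspace_id parameters are documented above.
import os
import requests

BASE_URL = "https://yeedu.example.com/api/v1"   # assumed
HEADERS = {"Authorization": "Bearer <token>"}    # assumed

local_path = "data/sub-file1.txt"
size = os.path.getsize(local_path)

# Hypothetical route; the body is streamed as raw bytes, which is an assumption.
with open(local_path, "rb") as fh:
    resp = requests.post(f"{BASE_URL}/workspace/files/upload",
                         headers={**HEADERS, "x-file-size": str(size)},
                         params={"workspace_id": 1,
                                 "path": "parent/sub-parent-1/sub-file1.txt",
                                 "overwrite": "true"},
                         data=fh)
resp.raise_for_status()
print(resp.json()["full_file_path"])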
Authorizations:
query Parameters
| workspace_id | integer <int64> Workspace ID used for filtering. |
| workspace_name | string Workspace name used for filtering. |
| overwrite | boolean Default: false Enum: true false Boolean flag to indicate whether to overwrite existing files. |
| is_dir | boolean Default: false Enum: true false A boolean flag indicating whether the uploaded file is part of a directory upload. |
| path required | string non-empty The path used to specify the file location when uploading a directory. |
| target_dir | string The target directory, used as a prefix, for the destination when uploading a directory or file. |
header Parameters
| x-file-size | number The header for file size |
Request Body schema: optional
File to upload to a workspace.
Metadata or control object.
Responses
Request samples
- Payload
{ }Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "workspace_file_id": "13",
- "file_name": "sub-file1.txt",
- "full_file_path": "file:///files/parent/sub-parent-1/sub-file1.txt",
- "file_size_bytes": "0",
- "file_type": "txt",
- "is_dir": false,
- "parent_id": "4",
- "tenant_id": "eb535243-3747-40ae-b75b-bbaef9bc944b",
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2025-01-02T16:28:20.201Z",
- "from_date": "2025-01-02T11:22:31.076Z",
- "to_date": null
}Search workspace files based on file name.
This API searches for files or directories in a workspace by their name. You can filter by workspace ID or name, file ID, file path, and whether to search only files or directories. It also supports recursive search and pagination for large results.
Authorizations:
query Parameters
| file_name required | string workspace file name used for filtering. |
| workspace_id | integer <int64> Workspace ID used for filtering. |
| workspace_name | string Workspace name used for filtering. |
| file_id | integer <int64> workspace file ID used for filtering. |
| file_path | string workspace file path used for filtering. |
| is_dir | boolean or null Enum: true false is_dir parameter used for filtering directories or files. |
| recursive | boolean Default: false Enum: true false Boolean flag to indicate whether to search files recursively. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "workspace_file_id": 9,
- "workspace": {
- "workspace_id": 1,
- "workspace_name": "test",
- "description": null
}, - "file_name": "file5.txt",
- "full_file_path": "file:///files/file5.txt",
- "file_size_bytes": "0",
- "file_type": "txt",
- "is_dir": false,
- "parent_id": null,
- "tenant_id": "eb535243-3747-40ae-b75b-bbaef9bc944b",
- "created_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2025-01-02T09:37:00.82327+00:00",
- "from_date": "2025-01-02T09:37:00.82327+00:00",
- "to_date": "infinity"
}, - {
- "workspace_file_id": 1,
- "workspace": {
- "workspace_id": 1,
- "workspace_name": "test",
- "description": null
}, - "file_name": "parent",
- "full_file_path": "file:///files/parent",
- "file_size_bytes": null,
- "file_type": null,
- "is_dir": true,
- "parent_id": null,
- "tenant_id": "eb535243-3747-40ae-b75b-bbaef9bc944b",
- "created_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2025-01-02T07:49:13.983084+00:00",
- "from_date": "2025-01-02T07:49:13.983084+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 2,
- "total_pages": 1,
- "limit": 100
}
}Get details of a specific workspace file or a directory.
This API fetches detailed metadata about a single file or directory in a workspace, identified by workspace ID/name and file ID or path.
Authorizations:
query Parameters
| workspace_id | integer <int64> Workspace ID used for filtering. |
| workspace_name | string Workspace name used for filtering. |
| file_id | integer <int64> workspace file ID used for filtering. |
| file_path | string workspace file path used for filtering. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "workspace_file_id": 9,
- "workspace": {
- "workspace_id": 1,
- "workspace_name": "test",
- "description": null
}, - "file_name": "file5.txt",
- "full_file_path": "file:///files/file5.txt",
- "file_size_bytes": "0",
- "file_type": "txt",
- "is_dir": false,
- "parent_id": null,
- "tenant_id": "eb535243-3747-40ae-b75b-bbaef9bc944b",
- "created_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2025-01-02T09:37:00.82327+00:00",
- "from_date": "2025-01-02T09:37:00.82327+00:00",
- "to_date": "infinity"
}Delete a specific workspace file.
Delete a workspace file, identified by the workspace ID or name along with the file ID or file path.
Authorizations:
query Parameters
| workspace_id | integer <int64> Workspace ID used for filtering. |
| workspace_name | string Workspace name used for filtering. |
| file_id | integer <int64> workspace file ID used for deletion. |
| file_path | string workspace file path used for deletion. |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 500
{- "message": "The File '/yeedu/files/8cee6100-7086-4138-92fd-712046174e91/1/spark-sql_2.12-3.2.0.jar' has been deleted."
}Download workspace files or directories.
This API allows downloading the content of a workspace file or directory specified by workspace ID/name and file ID or path.
Authorizations:
query Parameters
| workspace_id | integer <int64> Workspace ID used for filtering. |
| workspace_name | string Workspace name used for filtering. |
| file_id | integer <int64> workspace file ID used for filtering. |
| file_path | string workspace file path used for filtering. |
Responses
Response samples
- 400
- 401
- 404
- 500
{- "error_code": "string",
- "error_message": "string"
}Rename workspace file or directory.
This API renames a workspace file or directory identified by workspace ID/name and file ID or path.
Authorizations:
query Parameters
| workspace_id | integer <int64> Workspace ID used for filtering. |
| workspace_name | string Workspace name used for filtering. |
| file_id | integer <int64> workspace file ID used for renaming. |
| file_path | string non-empty workspace file path used for filtering. |
| file_name required | string [ 1 .. 1000 ] characters new name for the file/directory |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "message": "Renamed /home/user/documents/report.csv to /workspace/shared/report.csv in Workspace Id: 123"
}Move workspace file or directory from one path to another path.
This API moves a file or directory from one path to another within the same workspace. The source file is identified by its ID or path, and the destination path is required. You can specify whether to overwrite an existing file at the destination.
Authorizations:
query Parameters
| workspace_id | integer <int64> Workspace ID used for filtering. |
| workspace_name | string Workspace name used for filtering. |
| source_file_id | integer <int64> workspace file ID of source file/directory used for moving. |
| source_file_path | string non-empty workspace file path of source file/directory used for moving. |
| destination_file_path required | string [ 1 .. 1000 ] characters workspace file path of destination file/directory used for moving |
| overwrite | boolean Default: false Enum: true false Boolean flag to indicate whether to overwrite existing files. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 409
- 500
{- "message": "Moved /home/user/documents/report.csv to /workspace/shared/report.csv in Workspace Id: 123"
}Copy workspace file or directory from one path to another path.
This API copies a file or directory from a source path to a destination path within the workspace. It supports overwrite options if the destination file already exists.
Authorizations:
query Parameters
| workspace_id | integer <int64> Workspace ID used for filtering. |
| workspace_name | string Workspace name used for filtering. |
| source_file_id | integer <int64> workspace file ID of source file/directory used for copy. |
| source_file_path | string non-empty workspace file path of source file/directory used for copy. |
| destination_file_path required | string [ 1 .. 1000 ] characters workspace file path of destination file/directory used for copy. |
| overwrite | boolean Default: false Enum: true false Boolean flag to indicate whether to overwrite existing files. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 409
- 500
{- "message": "Copied /home/user/documents/report.csv to /workspace/shared/report.csv in Workspace Id: 123"
}Get workspace files usage details of a specific Workspace.
Retrieve workspace files usage details of a workspace filtered by its ID or name.
- workspace_files_maximum_upload_limit: The maximum number of files that can be uploaded per workspace is 10000.
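A minimal sketch that reads the usage response and reports remaining capacity; the base URL, bearer token, and route are assumptions, while the field names match the response sample below.
import requests

BASE_URL = "https://yeedu.example.com/api/v1"   # assumed
HEADERS = {"Authorization": "Bearer <token>"}    # assumed

# Hypothetical route; returns the usage fields shown in the response sample below.
resp = requests.get(f"{BASE_URL}/workspace/files/usage",
                    headers=HEADERS, params={"workspace_id": 1})
resp.raise_for_status()
usage = resp.json()
used = usage["workspace_files_usage_in_bytes"]
allowed = usage["workspace_files_maximum_allowed_usage_in_bytes"]
remaining_uploads = (usage["workspace_files_maximum_upload_limit"]
                     - usage["workspace_files_uploaded"])
print(f"{used / allowed:.1%} of storage used, {remaining_uploads} uploads left")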
Authorizations:
query Parameters
| workspace_id | integer <int64> Workspace ID used for filtering. |
| workspace_name | string Workspace name used for filtering. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "workspace_id": 1,
- "workspace_name": "test",
- "workspace_files_usage_in_bytes": 4675074494.19,
- "workspace_files_available_usage_in_bytes": 49012016705.81,
- "workspace_files_maximum_allowed_usage_in_bytes": 53687091200,
- "workspace_files_uploaded": 1736,
- "workspace_files_maximum_upload_limit": 10000
}Get all the workspace secrets.
This API returns all secrets stored in a specified workspace, identified by workspace ID or name. Users can filter secrets by type or specific secret ID. Results support pagination to handle large numbers of secrets.
Authorizations:
query Parameters
| workspace_id | integer <int64> Specifies the ID of the workspace for filtering. |
| workspace_name | string Specifies the name of the workspace for filtering. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
| secret_type | string Enum: "HIVE KERBEROS" "HIVE BASIC" "DATABRICKS UNITY TOKEN" "ENVIRONMENT VARIABLE" "AWS ACCESS SECRET KEY PAIR" "AZURE SERVICE PRINCIPAL" "GOOGLE SERVICE ACCOUNT" Type of secret to filter the secrets. |
| workspace_secret_id | string Filter secrets by a specific workspace secret ID. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "workspace_secret_id": 4,
- "description": null,
- "name": "secretKeyName",
- "secret_type": "ENVIRONMENT VARIABLE",
- "workspace": {
- "workspace_id": 2,
- "name": "test-workspace",
- "description": null
}, - "created_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2025-03-04T10:41:41.907762+00:00",
- "from_date": "2025-03-04T10:41:41.907762+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 2,
- "total_pages": 2,
- "limit": 1,
- "next_page": 2
}
}Create a new workspace secret.
Creates a new secret for the specified workspace.
- Users with Platform Admin role, Admin role within a tenant, or with MANAGE permission in a workspace, are authorized to create workspace secrets for a specified workspace_id.
- The name of the secret should always be unique.
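A minimal creation sketch for an AWS ACCESS SECRET KEY PAIR secret, sending only the fields relevant to that secret_type (mirroring the Payload sample below). The base URL, bearer token, and route are assumptions.
import requests

BASE_URL = "https://yeedu.example.com/api/v1"   # assumed
HEADERS = {"Authorization": "Bearer <token>"}    # assumed

payload = {
    "secret_type": "AWS ACCESS SECRET KEY PAIR",
    "name": "aws_secret",                        # must be unique within the workspace
    "description": "AWS credentials for accessing S3",
    "aws_access_key_id": "AKIAXXXXXXX",
    "aws_secret_access_key": "abcd1234XXXX",
    "aws_default_region": "us-west-2",
}
# Hypothetical route; workspace_id or workspace_name selects the target workspace.
resp = requests.post(f"{BASE_URL}/workspace/secret",
                     headers=HEADERS, params={"workspace_id": 2}, json=payload)
resp.raise_for_status()
print(resp.json()["workspace_secret_id"])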
Authorizations:
query Parameters
| workspace_id | integer <int64> Specifies the ID of the workspace for filtering. |
| workspace_name | string Specifies the name of the workspace for filtering. |
Request Body schema: application/jsonrequired
Workspace secret to be created.
| secret_type required | string Enum: "HIVE KERBEROS" "HIVE BASIC" "DATABRICKS UNITY TOKEN" "ENVIRONMENT VARIABLE" "AWS ACCESS SECRET KEY PAIR" "AZURE SERVICE PRINCIPAL" "GOOGLE SERVICE ACCOUNT" Type of authentication secret. |
| name required | string non-empty Secret identifier. |
| description | string or null non-empty Optional secret details. |
| principal required | string non-empty Kerberos principal. |
| keytab required | string non-empty Keytab file path. |
Responses
Request samples
- Payload
{- "secret_type": "AWS ACCESS SECRET KEY PAIR",
- "name": "aws_secret",
- "description": "AWS credentials for accessing S3",
- "type": "string",
- "project_id": "string",
- "private_key_id": "string",
- "private_key": "string",
- "client_email": "string",
- "client_id": "string",
- "auth_uri": "string",
- "token_uri": "string",
- "auth_provider_x509_cert_url": "string",
- "client_x509_cert_url": "string",
- "access_key": "string",
- "secret_key": "string",
- "universe_domain": "string",
- "aws_access_key_id": "AKIAXXXXXXX",
- "aws_secret_access_key": "abcd1234XXXX",
- "aws_default_region": "us-west-2"
}Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "workspace_secret_id": "1",
- "description": "Neo4j crdentials",
- "workspace_id": "1",
- "name": "key",
- "secret_type": "ENVIRONMENT VARIABLE",
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2025-03-04T06:42:27.002Z",
- "from_date": "2025-03-04T06:42:27.002Z",
- "to_date": null
}Update an existing workspace secret.
Updates the specified secret for the specified workspace.
- Users with Platform Admin role, Admin role within a tenant, or with MANAGE permission in a workspace, are authorized to update workspace secrets for a specified workspace_id.
- The name of the secret should always be unique.
Authorizations:
query Parameters
| workspace_id | integer <int64> Specifies the ID of the workspace for filtering. |
| workspace_name | string Specifies the name of the workspace for filtering. |
| workspace_secret_id required | integer <int64> The ID of the workspace secret to be updated. |
Request Body schema: application/jsonrequired
Workspace secret details to be updated.
| secret_type | string Enum: "HIVE KERBEROS" "HIVE BASIC" "DATABRICKS UNITY TOKEN" "ENVIRONMENT VARIABLE" "AWS ACCESS SECRET KEY PAIR" "AZURE SERVICE PRINCIPAL" "GOOGLE SERVICE ACCOUNT" Type of authentication secret. |
| description | string or null non-empty Optional secret details. |
| principal | string non-empty Kerberos principal. |
| keytab | string non-empty Keytab file path. |
Responses
Request samples
- Payload
{- "secret_type": "AWS ACCESS SECRET KEY PAIR",
- "description": "AWS credentials for accessing S3",
- "type": "string",
- "project_id": "string",
- "private_key_id": "string",
- "private_key": "string",
- "client_email": "string",
- "client_id": "string",
- "auth_uri": "string",
- "token_uri": "string",
- "auth_provider_x509_cert_url": "string",
- "client_x509_cert_url": "string",
- "access_key": "string",
- "secret_key": "string",
- "universe_domain": "string",
- "aws_access_key_id": "AKIAXXXXXXX",
- "aws_secret_access_key": "abcd1234XXXX",
- "aws_default_region": "us-west-2"
}Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "workspace_secret_id": "1",
- "description": "Neo4j crdentials",
- "workspace_id": "1",
- "name": "key",
- "secret_type": "ENVIRONMENT VARIABLE",
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2025-03-04T06:42:27.002Z",
- "from_date": "2025-03-04T06:42:27.002Z",
- "to_date": null
}Delete an existing workspace secret.
- Deletes the specified secret for the specified workspace.
- Users with Platform Admin role, Admin role within a tenant, or with MANAGE permission in a workspace, are authorized to delete workspace secrets for a specified workspace_id.
Authorizations:
query Parameters
| workspace_id | integer <int64> Specifies the ID of the workspace for filtering. |
| workspace_name | string Specifies the name of the workspace for filtering. |
| workspace_secret_id required | integer <int64> The ID of the workspace secret to be deleted. |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 500
{- "message": "Deleted workspace secret id: 1."
}Search all the workspace secrets of the specified workspace based on secret name.
This API searches workspace secrets based on their name within a specified workspace. Users can filter by workspace ID or name and secret type. Pagination is supported for large result sets.
Authorizations:
query Parameters
| secret_name required | string Secret name that will be used for filter |
| workspace_id | integer <int64> Specifies the ID of the workspace for filtering. |
| workspace_name | string Specifies the name of the workspace for filtering. |
| secret_type | string Enum: "HIVE KERBEROS" "HIVE BASIC" "DATABRICKS UNITY TOKEN" "ENVIRONMENT VARIABLE" "AWS ACCESS SECRET KEY PAIR" "AZURE SERVICE PRINCIPAL" "GOOGLE SERVICE ACCOUNT" Type of secret to filter the secrets. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "workspace_secret_id": 4,
- "description": null,
- "name": "secretKeyName",
- "secret_type": "ENVIRONMENT VARIABLE",
- "workspace": {
- "workspace_id": 2,
- "name": "test-workspace",
- "description": null
}, - "created_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2025-03-04T10:41:41.907762+00:00",
- "from_date": "2025-03-04T10:41:41.907762+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 2,
- "total_pages": 2,
- "limit": 1,
- "next_page": 2
}
}Get all users having a specific permission in a workspace.
Users having a specific permission in a workspace are listed by workspace ID.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace to retrieve users from. |
query Parameters
| permission_id | integer <int64> Specifies the ID of the permission to retrieve users. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": {
- "workspace_id": 1,
- "workspace_name": "test-workspace",
- "description": null,
- "users": [
- {
- "user_id": 12,
- "username": "ru0001-yeedu@yeedu.io",
- "display_name": "RU0001",
- "email": null,
- "permission": {
- "permission_id": 1,
- "name": "RUN",
- "description": "To list and run the jobs within a workspace"
}
}
]
}, - "result_set": {
- "current_page": 1,
- "total_objects": 1,
- "total_pages": 1,
- "limit": 100
}
}Search all users having a specific permission in a workspace.
Users having a specific permission in a workspace are searched by username.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace to search users from. |
query Parameters
| username required | string Specifies username to retrieve users. |
| permission_id | integer <int64> Specifies the ID of the permission to retrieve users. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": {
- "workspace_id": 1,
- "workspace_name": "test-workspace",
- "description": null,
- "users": [
- {
- "user_id": 12,
- "username": "ru0001-yeedu@yeedu.io",
- "display_name": "RU0001",
- "email": null,
- "permission": {
- "permission_id": 1,
- "name": "RUN",
- "description": "To list and run the jobs within a workspace"
}
}
]
}, - "result_set": {
- "current_page": 1,
- "total_objects": 1,
- "total_pages": 1,
- "limit": 100
}
}Get users for a specific Workspace ID.
List users who have no permissions in a workspace.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": {
- "workspace_id": 1,
- "workspace_name": "test-workspace",
- "description": null,
- "permission": {
- "permission_id": 1,
- "name": "RUN",
- "description": "To list and run the jobs within a workspace",
- "users": [
- {
- "user_id": 12,
- "username": "ru0001-yeedu@yeedu.io",
- "display_name": "RU0001",
- "email": null
}
]
}
}, - "result_set": {
- "current_page": 1,
- "total_objects": 1,
- "total_pages": 1,
- "limit": 100
}
}Search users by Workspace ID and Username.
Search for users by username who have no permissions in a workspace.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| username required | string Specifies the username for searching. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": {
- "workspace_id": 1,
- "workspace_name": "test-workspace",
- "description": null,
- "permission": {
- "permission_id": 1,
- "name": "RUN",
- "description": "To list and run the jobs within a workspace",
- "users": [
- {
- "user_id": 12,
- "username": "ru0001-yeedu@yeedu.io",
- "display_name": "RU0001",
- "email": null
}
]
}
}, - "result_set": {
- "current_page": 1,
- "total_objects": 1,
- "total_pages": 1,
- "limit": 100
}
}Match username for a specific Workspace ID.
Match username for a specific Workspace ID.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
| username required | string Specifies the username to get an exact match. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "user_id": 12,
- "username": "ru0001-yeedu@yeedu.io",
- "display_name": "RU0001",
- "email": "ru0001-yeedu@yeedu.io",
- "from_date": "2024-06-20T10:38:48.746144+00:00",
- "to_date": "infinity"
}Get user's workspace permission.
Retrieve workspace permission of a user by a specific Workspace ID and User ID.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
| user_id required | integer <int64> Specifies the ID of the user for filtering. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "user_id": "12",
- "username": "ru0001-yeedu@yeedu.io",
- "workspace": {
- "workspace_id": "1",
- "name": "test-workspace",
- "description": null
}, - "tenant_id": "9d6d3054-a5f6-4dbf-86f9-26989eb73ed3",
- "user_permission": {
- "auth_workspace_user_id": 17,
- "permission": {
- "auth_workspace_perm_id": 1,
- "name": "RUN",
- "description": "To list and run the jobs within a workspace"
}, - "created_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "last_update_date": "2024-06-20T11:17:23.25889+00:00",
- "from_date": "2024-06-20T11:17:23.25889+00:00",
- "to_date": "infinity"
}, - "group_permission": null
}Get all groups having a specific permission in a workspace.
Groups having a specific permission in a workspace are listed by workspace ID and permission ID.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace to retrieve groups from. |
query Parameters
| permission_id | integer <int64> Specifies the ID of the permission to retrieve groups. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": {
- "workspace_id": 1,
- "workspace_name": "test_workspace",
- "description": null,
- "groups": [
- {
- "group_id": 75,
- "group_name": "Yeedu",
- "group_mail": null,
- "group_object_id": "98b270e4-42fd-4969-8884-25a2e8384e53",
- "group_type": null,
- "permission": {
- "permission_id": 3,
- "name": "MANAGE",
- "description": "To list jobs, run jobs, edit Spark jobs, and manage workspace access by adding or removing permissions"
}
}
]
}, - "result_set": {
- "current_page": 1,
- "total_objects": 1,
- "total_pages": 1,
- "limit": 100
}
}Search groups having a specific permission in a workspace.
Groups having a specific permission in a workspace are searched by groupname.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace to search groups from. |
query Parameters
| groupname required | string Specifies groupname to retrieve groups. |
| permission_id | integer <int64> Specifies the ID of the permission to retrieve groups. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": {
- "workspace_id": 1,
- "workspace_name": "test_workspace",
- "description": null,
- "groups": [
- {
- "group_id": 75,
- "group_name": "Yeedu",
- "group_mail": null,
- "group_object_id": "98b270e4-42fd-4969-8884-25a2e8384e53",
- "group_type": null,
- "permission": {
- "permission_id": 3,
- "name": "MANAGE",
- "description": "To list jobs, run jobs, edit Spark jobs, and manage workspace access by adding or removing permissions"
}
}
]
}, - "result_set": {
- "current_page": 1,
- "total_objects": 1,
- "total_pages": 1,
- "limit": 100
}
}Get groups for a specific Workspace ID.
List groups that have no permissions in a workspace.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": {
- "workspace_id": 3,
- "workspace_name": "stats_test_workspace",
- "description": null,
- "permission": {
- "permission_id": 3,
- "name": "MANAGE",
- "description": "To list jobs, run jobs, edit Spark jobs, and manage workspace access by adding or removing permissions",
- "groups": [
- {
- "group_id": 75,
- "group_name": "G_Yeedu_Analyst",
- "group_mail": null,
- "group_object_id": "98b270e4-42fd-4969-8884-25a2e8384e53",
- "group_type": null
}
]
}
}, - "result_set": {
- "current_page": 1,
- "total_objects": 1,
- "total_pages": 1,
- "limit": 100
}
}Search groups by Workspace ID and Group name.
Search for groups by groupname that have no permissions in a workspace.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| groupname required | string Specifies the group name for searching. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": {
- "workspace_id": 3,
- "workspace_name": "stats_test_workspace",
- "description": null,
- "permission": {
- "permission_id": 3,
- "name": "MANAGE",
- "description": "To list jobs, run jobs, edit Spark jobs, and manage workspace access by adding or removing permissions",
- "groups": [
- {
- "group_id": 75,
- "group_name": "G_Yeedu_Analyst",
- "group_mail": null,
- "group_object_id": "98b270e4-42fd-4969-8884-25a2e8384e53",
- "group_type": null
}
]
}
}, - "result_set": {
- "current_page": 1,
- "total_objects": 1,
- "total_pages": 1,
- "limit": 100
}
}Match groupname for a specific Workspace ID.
Match groupname for a specific Workspace ID.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
| groupname required | string Specifies the group name to get an exact match. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "group_id": 602,
- "group_name": "G_Yeedu_Auditor",
- "group_object_id": "ba7d4601-57-a055-953d800",
- "group_type": null,
- "from_date": "2024-06-20T11:25:39.573594+00:00",
- "to_date": "infinity"
}
]Get group's workspace permission.
Retrieve workspace permission of a group by a specific Workspace ID and group ID.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
| group_id required | integer <int64> Specifies the ID of the group for filtering. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "group_id": "75",
- "group_name": "G_Yeedu_Analyst",
- "workspace": {
- "workspace_id": "3",
- "name": "stats_test_workspace",
- "description": null
}, - "tenant_id": "9d6d3054-a5f6-4dbf-86f9-26989eb73ed3",
- "group_permission": {
- "auth_workspace_group_id": "8",
- "permission": {
- "auth_workspace_perm_id": 3,
- "name": "MANAGE",
- "description": "To list jobs, run jobs, edit Spark jobs, and manage workspace access by adding or removing permissions"
}, - "created_by": {
- "user_id": "1",
- "username": "ysu0000-yeedu@yeedu.io"
}, - "modified_by": {
- "user_id": "1",
- "username": "ysu0000-yeedu@yeedu.io"
}, - "last_update_date": "2024-06-21T18:14:17.877Z",
- "from_date": "2024-06-21T18:14:17.877Z",
- "to_date": null
}
}Assign a permission to a user for a specific workspace.
Provide access to a user for a workspace with a specific permission.
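A minimal sketch of granting the permission; the JSON body matches the documented user_id/permission_id schema, while the base URL, bearer token, and route are assumptions.
import requests

BASE_URL = "https://yeedu.example.com/api/v1"   # assumed
HEADERS = {"Authorization": "Bearer <token>"}    # assumed
workspace_id = 1

# Hypothetical route; the body follows the documented request schema below.
resp = requests.post(f"{BASE_URL}/workspace/{workspace_id}/user/permission",
                     headers=HEADERS, json={"user_id": 12, "permission_id": 1})
resp.raise_for_status()
print(resp.json())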
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace where the permission will be granted. |
Request Body schema: application/jsonrequired
The workspace permission for the user to be added.
| user_id required | integer <int64> |
| permission_id required | integer <int64> |
Responses
Request samples
- Payload
{- "user_id": 1,
- "permission_id": 1
}Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "auth_workspace_user_id": "36",
- "workspace_id": "1",
- "auth_workspace_perm_id": 2,
- "user_id": "1",
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-06-21T18:25:19.905Z",
- "from_date": "2024-06-21T18:25:19.905Z",
- "to_date": null
}Assign a permission to a group for a specific workspace.
Provide access to a group for a workspace with a specific permission.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace where the permission will be granted. |
Request Body schema: application/jsonrequired
The workspace permission for the group to be added.
| group_id required | integer <int64> |
| permission_id required | integer <int64> |
Responses
Request samples
- Payload
{- "group_id": 1,
- "permission_id": 1
}Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "auth_workspace_group_id": "9",
- "workspace_id": "1",
- "auth_workspace_perm_id": 2,
- "group_id": "1",
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-06-21T18:27:27.518Z",
- "from_date": "2024-06-21T18:27:27.518Z",
- "to_date": null
}Revoke a workspace permission for a user.
This API removes a specific permission assigned to a user within a given workspace. You must specify the workspace ID, the user ID whose permission is to be revoked, and the permission ID itself.
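A minimal revocation sketch built from the three documented path parameters; the base URL, bearer token, and route are assumptions.
import requests

BASE_URL = "https://yeedu.example.com/api/v1"   # assumed
HEADERS = {"Authorization": "Bearer <token>"}    # assumed
workspace_id, user_id, permission_id = 1, 12, 1

# Hypothetical route combining the workspace, user, and permission IDs.
resp = requests.delete(
    f"{BASE_URL}/workspace/{workspace_id}/user/{user_id}/permission/{permission_id}",
    headers=HEADERS)
resp.raise_for_status()
print(resp.json()["message"])  # e.g. "Deleted Permission Successfully."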
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace from which the permission will be revoked. |
| user_id required | integer <int64> Specifies the ID of the user for whom the permission will be revoked. |
| permission_id required | integer <int64> Specifies the ID of the permission to be revoked from the user. |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 500
{- "message": "Deleted Permission Successfully."
}Revoke a workspace permission for a group.
This API removes a specific permission assigned to a group within a given workspace. You need to provide the workspace ID, the group ID whose permission is to be revoked, and the permission ID to be removed.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace from which the permission will be revoked. |
| group_id required | integer <int64> Specifies the ID of the group for whom the permission will be revoked. |
| permission_id required | integer <int64> Specifies the ID of the permission to be revoked from the group. |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 500
{- "message": "Deleted Permission Successfully."
}Get all Spark jobs.
Retrieves a list of Spark jobs.
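A minimal listing sketch with several of the documented filters. The base URL, bearer token, and route are assumptions; array filters are sent as repeated query parameters (the requests default), so confirm the serialization your deployment expects.
import requests

BASE_URL = "https://yeedu.example.com/api/v1"   # assumed
HEADERS = {"Authorization": "Bearer <token>"}    # assumed
workspace_id = 89

# Hypothetical route; filters follow the documented query parameters below.
resp = requests.get(f"{BASE_URL}/workspace/{workspace_id}/sparkjobs",
                    headers=HEADERS,
                    params={"enable": "true",
                            "job_type": ["SPARK_JOB", "SPARK_SQL"],
                            "last_run_status": ["ERROR", "TERMINATED"],
                            "limit": 100, "pageNumber": 1})
resp.raise_for_status()
for job in resp.json()["data"]:
    print(job["job_id"], job["job_name"])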
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| enable | boolean Enum: true false Specifies which Spark jobs to list.
Note: If unspecified, all Spark jobs (both active and disabled) will be listed. |
| cluster_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of cluster instance IDs to filter on. |
| job_type | Array of strings Items Enum: "SPARK_JOB" "SPARK_SQL" "THRIFT_SQL" "YEEDU_FUNCTIONS" An optional set of Spark job types to filter Spark jobs. |
| job_type_langs | Array of strings Items Enum: "RAW_SCALA" "Jar" "Python3" "SQL" Specifies the languages of the Spark job for filtering. |
| has_run | boolean Enum: true false Specifies which Spark jobs to list based on whether they have runs.
Note: If unspecified, all Spark jobs (both with and without runs) will be listed. |
| last_run_status | Array of strings (JobStatus) Items Enum: "SUBMITTED" "RUNNING" "DONE" "ERROR" "TERMINATED" "STOPPING" "STOPPED" Specifies the last run status of the Spark job for filtering. |
| created_by_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of created by user IDs to filter on. |
| modified_by_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of modified by user IDs to filter on. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "job_id": 23,
- "job_name": "spark_example",
- "spark_job_type": {
- "job_type": "SPARK_JOB",
- "language": "Jar"
}, - "cluster_info": {
- "cluster_id": 13,
- "name": "gcp_test",
- "cluster_status": "RUNNING",
- "cluster_type": "YEEDU",
- "instance_size": 1,
- "min_instances": 1,
- "max_instances": 1,
- "cloud_env": {
- "cloud_env_id": 11,
- "name": "test",
- "cloud_provider": {
- "cloud_provider_id": 0,
- "name": "GCP"
}
}, - "cluster_conf": {
- "cluster_conf_id": 10,
- "cluster_conf_name": "n1-standard-4",
- "machine_type_category": "general_purpose",
- "machine_type": {
- "machine_type_id": 10,
- "name": "n1-standard-4",
- "vCPUs": 4,
- "memory": "15 GiB",
- "has_cuda": false,
- "gpu_model": null,
- "gpus": 0,
- "gpu_memory": null,
- "cpu_model": [
- "Intel Xeon Scalable (Skylake) 1st Generation",
- "Intel Xeon E5 v4 (Broadwell E5)",
- "Intel Xeon E5 v3 (Haswell)",
- "Intel Xeon E5 v2 (Ivy Bridge)",
- "Intel Xeon E5 (Sandy Bridge)"
], - "cpu_min_frequency_GHz": [
- "2.0",
- "2.2",
- "2.3",
- "2.5",
- "2.6"
], - "cpu_max_frequency_GHz": [
- "3.5",
- "3.7",
- "3.8",
- "3.5",
- "3.6"
], - "has_local_disk": false,
- "local_disk_size_GB": null,
- "local_num_of_disks": null,
- "local_disk_throughput_MB": null,
- "machine_price_ycu": 2.5
}, - "machine_volume_conf": {
- "volume_conf_id": 2,
- "name": "volume_gcp_2",
- "size": 375,
- "machine_volume_num": 2,
- "machine_volume_strip_num": 2
}
}, - "metastore_catalog": {
- "metastore_catalog_id": 4,
- "metastore_catalog_name": "aws_unity_catalog",
- "description": null,
- "metastore_catalog_type": {
- "metastore_catalog_type_id": 2,
- "name": "DATABRICKS UNITY",
- "description": null
}
}, - "spark_infra_version": {
- "spark_infra_version_id": 1,
- "spark_docker_image_name": "v3.2.2-28",
- "spark_version": "3.2.2",
- "hive_version": "2.3.9",
- "hadoop_version": "3.2.4",
- "scala_version": "2.12.15",
- "python_version": "3.9.5",
- "notebook_support": true,
- "has_cuda_support": true,
- "thrift_support": false,
- "yeedu_functions_support": true
}, - "engine_cluster_spark_config": {
- "max_parallel_spark_job_execution_per_instance": 5,
- "num_of_workers": null
}
}, - "last_job_run": {
- "run_id": null,
- "run_status": null
}, - "created_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "last_update_date": "2024-06-20T17:24:19.782774+00:00",
- "from_date": "2024-06-20T17:24:19.782774+00:00",
- "to_date": "infinity"
}
]
}Search Spark jobs by job name.
Retrieves a list of Spark jobs based on a search by job name.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| job_name required | string Specifies the name of the Spark job to search for. |
| enable | boolean Enum: true false Specifies which Spark jobs to search.
Note: If unspecified, all Spark jobs (both active and disabled) will be searched. |
| has_run | boolean Enum: true false Specifies which Spark jobs to list based on whether they have runs.
Note: If unspecified, all Spark jobs (both with and without runs) will be listed. |
| cluster_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of cluster instance IDs to filter on. |
| job_type | Array of strings Items Enum: "SPARK_JOB" "SPARK_SQL" "THRIFT_SQL" "YEEDU_FUNCTIONS" An optional set of Spark job types to filter Spark jobs. |
| job_type_langs | Array of strings Items Enum: "RAW_SCALA" "Jar" "Python3" "SQL" Specifies the languages of the Spark job for filtering. |
| last_run_status | Array of strings (JobStatus) Items Enum: "SUBMITTED" "RUNNING" "DONE" "ERROR" "TERMINATED" "STOPPING" "STOPPED" Specifies the last run status of the Spark job for filtering. |
| created_by_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of created by user IDs to filter on. |
| modified_by_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of modified by user IDs to filter on. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "job_id": 23,
- "job_name": "spark_example",
- "spark_job_type": {
- "job_type": "SPARK_JOB",
- "language": "Jar"
}, - "cluster_info": {
- "cluster_id": 13,
- "name": "gcp_test",
- "cluster_status": "RUNNING",
- "cluster_type": "YEEDU",
- "instance_size": 1,
- "min_instances": 1,
- "max_instances": 1,
- "cloud_env": {
- "cloud_env_id": 11,
- "name": "test",
- "cloud_provider": {
- "cloud_provider_id": 0,
- "name": "GCP"
}
}, - "cluster_conf": {
- "cluster_conf_id": 10,
- "cluster_conf_name": "n1-standard-4",
- "machine_type_category": "general_purpose",
- "machine_type": {
- "machine_type_id": 10,
- "name": "n1-standard-4",
- "vCPUs": 4,
- "memory": "15 GiB",
- "has_cuda": false,
- "gpu_model": null,
- "gpus": 0,
- "gpu_memory": null,
- "cpu_model": [
- "Intel Xeon Scalable (Skylake) 1st Generation",
- "Intel Xeon E5 v4 (Broadwell E5)",
- "Intel Xeon E5 v3 (Haswell)",
- "Intel Xeon E5 v2 (Ivy Bridge)",
- "Intel Xeon E5 (Sandy Bridge)"
], - "cpu_min_frequency_GHz": [
- "2.0",
- "2.2",
- "2.3",
- "2.5",
- "2.6"
], - "cpu_max_frequency_GHz": [
- "3.5",
- "3.7",
- "3.8",
- "3.5",
- "3.6"
], - "has_local_disk": false,
- "local_disk_size_GB": null,
- "local_num_of_disks": null,
- "local_disk_throughput_MB": null,
- "machine_price_ycu": 2.5
}, - "machine_volume_conf": {
- "volume_conf_id": 2,
- "name": "volume_gcp_2",
- "size": 375,
- "machine_volume_num": 2,
- "machine_volume_strip_num": 2
}
}, - "metastore_catalog": {
- "metastore_catalog_id": 4,
- "metastore_catalog_name": "aws_unity_catalog",
- "description": null,
- "metastore_catalog_type": {
- "metastore_catalog_type_id": 2,
- "name": "DATABRICKS UNITY",
- "description": null
}
}, - "spark_infra_version": {
- "spark_infra_version_id": 1,
- "spark_docker_image_name": "v3.2.2-28",
- "spark_version": "3.2.2",
- "hive_version": "2.3.9",
- "hadoop_version": "3.2.4",
- "scala_version": "2.12.15",
- "python_version": "3.9.5",
- "notebook_support": true,
- "has_cuda_support": true,
- "thrift_support": false,
- "yeedu_functions_support": true
}, - "engine_cluster_spark_config": {
- "max_parallel_spark_job_execution_per_instance": 5,
- "num_of_workers": null
}
}, - "last_job_run": {
- "run_id": null,
- "run_status": null
}, - "created_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "last_update_date": "2024-06-20T17:24:19.782774+00:00",
- "from_date": "2024-06-20T17:24:19.782774+00:00",
- "to_date": "infinity"
}
]
}Create a new Spark job.
Creates a Spark job with the provided configurations.
max_concurrency: Specifies the maximum number of concurrent Spark job runs allowed for a particular Spark job.
- If max_concurrency is set to 0, an unlimited number of Spark job runs can be submitted using the same configuration.
- If max_concurrency is greater than 0, the number of job runs that can be submitted is limited to the specified max_concurrency value.
- If the job type is THRIFT_SQL, only a max_concurrency of 1 is allowed.
YEEDU_FUNCTIONS: If the job type is YEEDU_FUNCTIONS:
- yeedu_functions_max_request_concurrency: The value must be between 1 and 512.
- yeedu_functions_request_timeout_sec: The value must be between 1 and 900 seconds (1 second to 15 minutes).
- yeedu_functions_idle_timeout_sec: The value must be between 300 and 172800 seconds (5 minutes to 48 hours).
should_append_params: Determines whether the job-level Spark configuration should append to or override the cluster-level Spark configuration.
- If set to true, the job's Spark configuration is appended to the cluster's Spark configuration. This applies to fields such as --conf, --jars, and --packages. For example, if the cluster is configured with --packages=org.postgresql:postgresql:42.2.20 and the job specifies --packages=org.duckdb:duckdb_jdbc:0.9.1, the resulting configuration will be --packages=org.duckdb:duckdb_jdbc:0.9.1,org.postgresql:postgresql:42.2.20.
- If set to false, the job's Spark configuration overrides the cluster's Spark configuration.
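The request sample shown further down is also easy to drive programmatically. This sketch reuses the documented payload fields, while the base URL, path, and auth header are assumptions.

```python
import requests

BASE_URL = "https://yeedu.example.com/api/v1"       # hypothetical base URL
HEADERS = {"Authorization": "Bearer <your-token>"}  # assumed auth scheme

workspace_id = 1
job = {
    "name": "spark_job_examples",
    "cluster_id": 81,
    "max_concurrency": 100,
    "job_class_name": "org.apache.spark.examples.SparkPi",
    "job_command": "file:///yeedu/object-storage-manager/spark-examples_2.12-3.2.2.jar",
    "job_arguments": "500",
    "job_type": "Jar",
    "should_append_params": False,
}

resp = requests.post(f"{BASE_URL}/workspace/{workspace_id}/spark/job",  # assumed path
                     json=job, headers=HEADERS)
resp.raise_for_status()
print(resp.json()["job_id"])  # ID of the newly created Spark job
```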
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
Request Body schema: application/json (required)
The Spark job to be added.
| cluster_id | integer <int64> |
| cluster_name | string non-empty |
| max_concurrency | integer <int64> Default: 0 |
| name required | string non-empty The
|
| files | Array of strings unique [ items non-empty ] |
| properties_file | Array of strings unique [ items non-empty ] |
| conf | Array of strings unique [ items non-empty ] |
| packages | Array of strings unique [ items non-empty ] |
| repositories | Array of strings unique [ items non-empty ] |
| jars | Array of strings unique [ items non-empty ] |
| archives | Array of strings unique [ items non-empty ] |
| driver_memory | string or null ^(?=.*[1-9])\d+(?![\d.])(?=\D) The memory value cannot be negative or a decimal. |
| driver_java_options | string or null non-empty |
| driver_library_path | string or null non-empty |
| driver_class_path | string or null non-empty |
| executor_memory | string or null ^(?=.*[1-9])\d+(?![\d.])(?=\D) The memory value cannot be negative or a decimal. |
| principal | string or null non-empty |
| keytab | string or null non-empty |
| queue | string or null non-empty |
| job_class_name | string or null non-empty |
| job_command | string or null non-empty |
| job_arguments | string or null non-empty |
| job_rawScalaCode | string or null non-empty |
| job_type required | string Enum: "RAW_SCALA" "Jar" "Python3" "SQL" "THRIFT_SQL" "YEEDU_FUNCTIONS" |
| job_timeout_min | integer or null <int64> >= 1 |
| extra_info | object |
| driver_cores | integer or null <int32> >= 1 |
| total_executor_cores | integer or null <int64> >= 1 |
| executor_cores | integer or null <int32> >= 1 |
| num_executors | integer or null <int32> >= 1 |
| should_append_params | boolean Default: false If set to true, the job's Spark configuration is appended to the cluster's Spark configuration; if set to false, it overrides it. |
| yeedu_functions_project_path | string or null [ 1 .. 5000 ] characters |
| yeedu_functions_script_path | string or null <= 5000 characters |
| yeedu_functions_function_name | string or null [ 1 .. 5000 ] characters |
| yeedu_functions_requirements | string or null [ 1 .. 5000 ] characters |
| yeedu_functions_max_request_concurrency | integer or null <int64> [ 1 .. 512 ] |
| yeedu_functions_idle_timeout_sec | integer or null <int64> [ 300 .. 172800 ] |
| yeedu_functions_request_timeout_sec | integer or null <int64> [ 1 .. 900 ] |
| yeedu_functions_example_request_body | string or null non-empty |
Responses
Request samples
- Payload
{- "name": "spark_job_examples",
- "cluster_id": 81,
- "max_concurrency": 100,
- "job_class_name": "org.apache.spark.examples.SparkPi",
- "job_command": "file:///yeedu/object-storage-manager/spark-examples_2.12-3.2.2.jar",
- "job_arguments": "500",
- "job_rawScalaCode": null,
- "job_type": "Jar",
- "job_timeout_min": null,
- "files": [ ],
- "properties_file": [ ],
- "conf": [ ],
- "packages": [ ],
- "repositories": [ ],
- "jars": [ ],
- "archives": [ ],
- "driver_memory": null,
- "driver_java_options": null,
- "driver_library_path": null,
- "driver_class_path": null,
- "executor_memory": null,
- "driver_cores": null,
- "total_executor_cores": null,
- "executor_cores": null,
- "num_executors": null,
- "principal": null,
- "keytab": null,
- "queue": null,
- "should_append_params": false
}Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "job_id": "1",
- "name": "spark_examples",
- "cluster_id": "1",
- "workspace_id": "1",
- "spark_job_type_lang_id": 1,
- "max_concurrency": "100",
- "job_class_name": "org.apache.spark.examples.SparkPi",
- "job_command": "file:///yeedu/object-storage-manager/spark-examples_2.12-3.2.2.jar",
- "job_arguments": "500",
- "job_rawScalaCode": null,
- "files": null,
- "properties_file": null,
- "conf": null,
- "packages": null,
- "repositories": null,
- "jars": null,
- "archives": null,
- "driver_memory": null,
- "driver_java_options": null,
- "driver_library_path": null,
- "driver_class_path": null,
- "executor_memory": null,
- "driver_cores": null,
- "total_executor_cores": null,
- "executor_cores": null,
- "num_executors": null,
- "should_append_params": false,
- "principal": null,
- "keytab": null,
- "queue": null,
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-06-14T06:36:57.297Z",
- "from_date": "2024-06-14T06:29:47.214Z",
- "to_date": null
}Get details of a specific Spark job.
Retrieve Spark job details filtered by ID or name.
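A sketch of fetching a single Spark job; either job_id or job_name may be supplied as a query parameter. The path, base URL, and auth header are assumptions.

```python
import requests

BASE_URL = "https://yeedu.example.com/api/v1"       # hypothetical base URL
HEADERS = {"Authorization": "Bearer <your-token>"}  # assumed auth scheme

workspace_id = 1

# Filter by name; alternatively pass {"job_id": 23}.
resp = requests.get(f"{BASE_URL}/workspace/{workspace_id}/spark/job/details",  # assumed path
                    params={"job_name": "spark_example"}, headers=HEADERS)
resp.raise_for_status()
job = resp.json()
print(job["job_id"], job["spark_job_type"], job["max_concurrency"])
```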
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| job_id | integer <int64> Specifies the ID of the Spark job for filtering. |
| job_name | string Specifies the name of the Spark job for filtering. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "job_id": 23,
- "job_name": "spark_example",
- "cluster_info": {
- "cluster_id": 13,
- "name": "gcp_test",
- "cluster_status": "RUNNING",
- "cluster_type": "YEEDU",
- "instance_size": 1,
- "min_instances": 1,
- "max_instances": 1,
- "cluster_conf": {
- "cluster_conf_id": 10,
- "cluster_conf_name": "n1-standard-4",
- "machine_type_category": "general_purpose",
- "machine_type": {
- "machine_type_id": 10,
- "name": "n1-standard-4",
- "vCPUs": 4,
- "memory": "15 GiB",
- "has_cuda": false,
- "gpu_model": null,
- "gpus": 0,
- "gpu_memory": null,
- "cpu_model": [
- "Intel Xeon Scalable (Skylake) 1st Generation",
- "Intel Xeon E5 v4 (Broadwell E5)",
- "Intel Xeon E5 v3 (Haswell)",
- "Intel Xeon E5 v2 (Ivy Bridge)",
- "Intel Xeon E5 (Sandy Bridge)"
], - "cpu_min_frequency_GHz": [
- "2.0",
- "2.2",
- "2.3",
- "2.5",
- "2.6"
], - "cpu_max_frequency_GHz": [
- "3.5",
- "3.7",
- "3.8",
- "3.5",
- "3.6"
], - "has_local_disk": false,
- "local_disk_size_GB": null,
- "local_num_of_disks": null,
- "local_disk_throughput_MB": null,
- "machine_price_ycu": 2.5
}, - "machine_volume_conf": {
- "volume_conf_id": 2,
- "name": "volume_gcp_2",
- "size": 375,
- "machine_volume_num": 2,
- "machine_volume_strip_num": 2
}
}, - "cloud_env": {
- "cloud_env_id": 11,
- "name": "test",
- "cloud_provider": {
- "cloud_provider_id": 0,
- "name": "GCP"
}
}, - "spark_infra_version": {
- "spark_infra_version_id": 1,
- "spark_docker_image_name": "v3.2.2-28",
- "spark_version": "3.2.2",
- "hive_version": "2.3.9",
- "hadoop_version": "3.2.4",
- "scala_version": "2.12.15",
- "python_version": "3.9.5",
- "notebook_support": true,
- "has_cuda_support": true,
- "thrift_support": false,
- "yeedu_functions_support": true
}, - "engine_cluster_spark_config": {
- "max_parallel_spark_job_execution_per_instance": 5,
- "num_of_workers": null
}, - "metastore_catalog_id": 1
}, - "spark_job_type": {
- "job_type": "SPARK_JOB",
- "language": "Jar"
}, - "max_concurrency": 0,
- "job_class_name": "org.apache.spark.examples.SparkPi",
- "job_command": "file:///yeedu/object-storage-manager/spark-examples_2.12-3.2.2.jar",
- "job_arguments": "1000",
- "job_rawScalaCode": null,
- "job_timeout_min": null,
- "files": null,
- "properties_file": null,
- "conf": null,
- "packages": null,
- "repositories": null,
- "jars": null,
- "archives": null,
- "driver_memory": null,
- "driver_java_options": null,
- "driver_library_path": null,
- "driver_class_path": null,
- "executor_memory": null,
- "driver_cores": null,
- "total_executor_cores": null,
- "executor_cores": null,
- "num_executors": null,
- "should_append_params": false,
- "principal": null,
- "keytab": null,
- "queue": null,
- "tenant_id": "49a8817e-8b8e-4d76-a717-c33db6f7e018",
- "created_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "last_update_date": "2024-06-20T17:36:11.084+00:00",
- "from_date": "2024-06-20T17:24:19.782774+00:00",
- "to_date": "infinity"
}Update details of a specific Spark job.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| job_id | integer <int64> Specifies the ID of the Spark job for modification. |
| job_name | string Specifies the name of the Spark job for modification. |
Request Body schema: application/json (required)
Spark job details to be modified.
| name | string non-empty The
|
| cluster_id | integer or null <int64> |
| cluster_name | string or null |
| max_concurrency | integer <int64> |
| files | Array of strings or null unique |
| properties_file | Array of strings or null unique |
| conf | Array of strings or null unique |
| packages | Array of strings or null unique |
| repositories | Array of strings or null unique |
| jars | Array of strings or null unique |
| archives | Array of strings or null unique |
| driver_memory | string or null ^(?=.*[1-9])\d+(?![\d.])(?=\D) The memory value cannot be negative or a decimal. |
| driver_java_options | string or null non-empty |
| driver_library_path | string or null non-empty |
| driver_class_path | string or null non-empty |
| executor_memory | string or null ^(?=.*[1-9])\d+(?![\d.])(?=\D) The memory value cannot be negative or a decimal. |
| principal | string or null non-empty |
| keytab | string or null non-empty |
| queue | string or null non-empty |
| job_class_name | string or null non-empty |
| job_command | string or null non-empty |
| job_arguments | string or null non-empty |
| job_rawScalaCode | string or null non-empty |
| job_timeout_min | integer or null <int64> >= 1 |
| driver_cores | integer or null <int32> >= 1 |
| total_executor_cores | integer or null <int64> >= 1 |
| executor_cores | integer or null <int32> >= 1 |
| num_executors | integer or null <int32> >= 1 |
| should_append_params | boolean If set to true, the job's Spark configuration is appended to the cluster's Spark configuration; if set to false, it overrides it. |
| yeedu_functions_project_path | string or null [ 1 .. 5000 ] characters |
| yeedu_functions_script_path | string or null <= 5000 characters |
| yeedu_functions_function_name | string or null [ 1 .. 5000 ] characters |
| yeedu_functions_requirements | string or null [ 1 .. 5000 ] characters |
| yeedu_functions_max_request_concurrency | integer or null <int64> [ 1 .. 512 ] |
| yeedu_functions_idle_timeout_sec | integer or null <int64> [ 300 .. 172800 ] |
| yeedu_functions_request_timeout_sec | integer or null <int64> [ 1 .. 900 ] |
| yeedu_functions_example_request_body | string or null non-empty |
Responses
Request samples
- Payload
{- "name": "spark_examples",
- "cluster_id": "13",
- "max_concurrency": 1,
- "job_class_name": "org.apache.spark.examples.SparkPi",
- "job_command": "file:///yeedu/object-storage-manager/spark-examples_2.11-2.4.8.jar",
- "job_arguments": "1000",
- "job_rawScalaCode": null,
- "job_timeout_min": null,
- "files": [
- "file:///yeedu/object-storage-manager/privateKey"
], - "properties_file": null,
- "conf": [
- "spark.sql.files.maxRecordsPerFile=1000000",
- "spark.ui.enabled=false",
- "spark.driver.yeedu_privateKey=file:///yeedu/object-storage-manager/privateKey"
], - "packages": [
- "com.github.music-of-the-ainur:almaren-framework_2.11:0.9.3-2.4",
- "org.postgresql:postgresql:42.2.8",
- "org.apache.hadoop:hadoop-aws:2.10.1",
- "org.apache.spark:spark-hive_2.11:2.4.8"
], - "repositories": null,
- "jars": [
- "file:///yeedu/object-storage-manager/gcs-connector-hadoop2-latest.jar",
- "file:///yeedu/object-storage-manager/ojdbc6-11.2.0.4.jar"
], - "archives": null,
- "driver_memory": null,
- "driver_java_options": "-Dderby.system.home=/yeedu/spark_metastores/$(date +%s)-${RANDOM}",
- "driver_library_path": null,
- "driver_class_path": null,
- "executor_memory": null,
- "driver_cores": null,
- "total_executor_cores": null,
- "executor_cores": null,
- "num_executors": null,
- "principal": null,
- "keytab": null,
- "queue": null,
- "should_append_params": false
}Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "job_id": "1",
- "name": "spark_examples",
- "cluster_id": "1",
- "workspace_id": "1",
- "spark_job_type_lang_id": 1,
- "max_concurrency": "100",
- "job_class_name": "org.apache.spark.examples.SparkPi",
- "job_command": "file:///yeedu/object-storage-manager/spark-examples_2.12-3.2.2.jar",
- "job_arguments": "500",
- "job_rawScalaCode": null,
- "files": null,
- "properties_file": null,
- "conf": null,
- "packages": null,
- "repositories": null,
- "jars": null,
- "archives": null,
- "driver_memory": null,
- "driver_java_options": null,
- "driver_library_path": null,
- "driver_class_path": null,
- "executor_memory": null,
- "driver_cores": null,
- "total_executor_cores": null,
- "executor_cores": null,
- "num_executors": null,
- "should_append_params": false,
- "principal": null,
- "keytab": null,
- "queue": null,
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-06-14T06:36:57.297Z",
- "from_date": "2024-06-14T06:29:47.214Z",
- "to_date": null
}Enable a specific Spark job.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| job_id | integer <int64> Specifies the ID of the Spark job to enable. |
| job_name | string Specifies the name of the Spark job to enable. |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "job_id": "1",
- "name": "spark_examples",
- "cluster_id": "1",
- "workspace_id": "1",
- "spark_job_type_lang_id": 1,
- "max_concurrency": "100",
- "job_class_name": "org.apache.spark.examples.SparkPi",
- "job_command": "file:///yeedu/object-storage-manager/spark-examples_2.12-3.2.2.jar",
- "job_arguments": "500",
- "job_rawScalaCode": null,
- "files": null,
- "properties_file": null,
- "conf": null,
- "packages": null,
- "repositories": null,
- "jars": null,
- "archives": null,
- "driver_memory": null,
- "driver_java_options": null,
- "driver_library_path": null,
- "driver_class_path": null,
- "executor_memory": null,
- "driver_cores": null,
- "total_executor_cores": null,
- "executor_cores": null,
- "num_executors": null,
- "should_append_params": false,
- "principal": null,
- "keytab": null,
- "queue": null,
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-06-14T06:36:57.297Z",
- "from_date": "2024-06-14T06:29:47.214Z",
- "to_date": null
}Disable a specific Spark job.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| job_id | integer <int64> Specifies the ID of the Spark job to disable. |
| job_name | string Specifies the name of the Spark job to disable. |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "job_id": "1",
- "name": "spark_examples",
- "cluster_id": "1",
- "workspace_id": "1",
- "spark_job_type_lang_id": 1,
- "max_concurrency": "100",
- "job_class_name": "org.apache.spark.examples.SparkPi",
- "job_command": "file:///yeedu/object-storage-manager/spark-examples_2.12-3.2.2.jar",
- "job_arguments": "1000",
- "job_rawScalaCode": null,
- "files": null,
- "properties_file": null,
- "conf": null,
- "packages": null,
- "repositories": null,
- "jars": null,
- "archives": null,
- "driver_memory": null,
- "driver_java_options": null,
- "driver_library_path": null,
- "driver_class_path": null,
- "executor_memory": null,
- "driver_cores": null,
- "total_executor_cores": null,
- "executor_cores": null,
- "num_executors": null,
- "should_append_params": false,
- "principal": null,
- "keytab": null,
- "queue": null,
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-06-14T06:36:57.297Z",
- "from_date": "2024-06-14T06:29:47.214Z",
- "to_date": "2024-06-14T06:36:57.297Z"
}Get all filter data for Spark jobs within a workspace.
Retrieves a list of:
- Engine cluster instances used by Spark jobs within a workspace.
- Users who created the Spark jobs.
- Users who modified the Spark jobs.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| enable | boolean Enum: true false Specifies which Spark jobs to list.
Note: If unspecified, all Spark jobs (both active and disabled) will be listed. |
| filter_type required | string Enum: "cluster" "created_by_user" "modified_by_user" Specifies the filter type for the data to retrieve. Choose one of the following:
|
| name | string Specifies the name to filter by, applicable to the chosen filter type. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "cluster_id": 1,
- "cluster_name": "azure_instance"
}
]
}Export Spark job details for a specific Spark job ID
Exports the details of a Spark job for a specific Spark job ID or name.
- The user must have at least one permission in the workspace from which the job is being exported.
- The exported job will be of type "SPARK_JOB", "SPARK_SQL", or "YEEDU_FUNCTIONS".
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| job_id | integer <int64> Specifies the ID of the Spark job for filtering. |
| job_name | string Specifies the name of the Spark job for filtering. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "jobs": [
- {
- "name": "spark_examples",
- "spark_job_type": {
- "job_type": "SPARK_JOB",
- "language": "Jar"
}, - "cluster_info": null,
- "max_concurrency": 0,
- "files": null,
- "properties_file": null,
- "conf": null,
- "packages": null,
- "repositories": null,
- "jars": null,
- "archives": null,
- "driver_memory": null,
- "driver_java_options": "-Dderby.system.home=/yeedu/spark_metastores/1721804846-27730",
- "driver_library_path": null,
- "driver_class_path": null,
- "executor_memory": null,
- "principal": null,
- "keytab": null,
- "queue": null,
- "job_class_name": "org.apache.spark.examples.SparkPi",
- "job_command": "file:///yeedu/object-storage-manager/spark-examples_2.11-2.4.8.jar",
- "job_arguments": "250",
- "job_rawScalaCode": null,
- "job_timeout_min": null,
- "driver_cores": null,
- "total_executor_cores": null,
- "executor_cores": null,
- "num_executors": null,
- "should_append_params": false
}
]
}Get all Spark job runs.
Retrieves a list of Spark job runs.
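A sketch of listing runs filtered by status. As with the other examples, only the parameters are documented here, so the route, base URL, auth header, and repeated-parameter encoding are assumptions.

```python
import requests

BASE_URL = "https://yeedu.example.com/api/v1"       # hypothetical base URL
HEADERS = {"Authorization": "Bearer <your-token>"}  # assumed auth scheme

workspace_id = 1
params = {
    "run_status": ["SUBMITTED", "RUNNING"],  # repeated query parameter (assumed encoding)
    "limit": 100,
    "pageNumber": 1,
}

resp = requests.get(f"{BASE_URL}/workspace/{workspace_id}/spark/run",  # assumed path
                    params=params, headers=HEADERS)
resp.raise_for_status()
for run in resp.json()["data"]:
    print(run["run_id"], run["run_status"], run["total_run_time_sec"])
```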
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| cluster_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of cluster instance IDs to filter on. |
| job_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of Spark job IDs to filter on. |
| run_status | Array of strings (JobStatus) Items Enum: "SUBMITTED" "RUNNING" "DONE" "ERROR" "TERMINATED" "STOPPING" "STOPPED" Specifies the status of Spark job run for filtering. |
| job_type | Array of strings Items Enum: "SPARK_JOB" "SPARK_SQL" "NOTEBOOK" "THRIFT_SQL" "YEEDU_FUNCTIONS" Specifies the job type of the Spark job run for filtering. |
| job_type_langs | Array of strings Items Enum: "RAW_SCALA" "Jar" "Python3" "Scala" "SQL" An optional set of language filters for Spark job runs. |
| created_by_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of created by user IDs to filter on. |
| modified_by_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of modified by user IDs to filter on. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "run_id": 298,
- "application_id": "local-1718959721329",
- "run_status": "DONE",
- "total_run_time_sec": 8.105467,
- "execution_time_sec": 6.300996,
- "runtime_arguments": {
- "arguments": "1000",
- "conf": [
- "spark.executor.memory=4g"
]
}, - "job_conf": {
- "job_id": 23,
- "job_name": "spark_example",
- "spark_job_type": {
- "job_type": "SPARK_JOB",
- "language": "Jar"
}, - "cluster_info": {
- "cluster_id": 13,
- "name": "gcp_test",
- "cluster_status": "RUNNING",
- "cluster_type": "YEEDU",
- "instance_size": 1,
- "min_instances": 1,
- "max_instances": 1,
- "cloud_env": {
- "cloud_env_id": 11,
- "name": "test",
- "cloud_provider": {
- "cloud_provider_id": 0,
- "name": "GCP"
}
}, - "cluster_conf": {
- "cluster_conf_id": 10,
- "cluster_conf_name": "n1-standard-4",
- "machine_type_category": "general_purpose",
- "machine_type": {
- "machine_type_id": 10,
- "name": "n1-standard-4",
- "vCPUs": 4,
- "memory": "15 GiB",
- "has_cuda": false,
- "gpu_model": null,
- "gpus": 0,
- "gpu_memory": null,
- "cpu_model": [
- "Intel Xeon Scalable (Skylake) 1st Generation",
- "Intel Xeon E5 v4 (Broadwell E5)",
- "Intel Xeon E5 v3 (Haswell)",
- "Intel Xeon E5 v2 (Ivy Bridge)",
- "Intel Xeon E5 (Sandy Bridge)"
], - "cpu_min_frequency_GHz": [
- "2.0",
- "2.2",
- "2.3",
- "2.5",
- "2.6"
], - "cpu_max_frequency_GHz": [
- "3.5",
- "3.7",
- "3.8",
- "3.5",
- "3.6"
], - "has_local_disk": false,
- "local_disk_size_GB": null,
- "local_num_of_disks": null,
- "local_disk_throughput_MB": null,
- "machine_price_ycu": 2.5
}, - "machine_volume_conf": {
- "volume_conf_id": 2,
- "name": "volume_gcp_2",
- "size": 375,
- "machine_volume_num": 2,
- "machine_volume_strip_num": 2
}
}, - "metastore_catalog": {
- "metastore_catalog_id": 4,
- "metastore_catalog_name": "aws_unity_catalog",
- "description": null,
- "metastore_catalog_type": {
- "metastore_catalog_type_id": 2,
- "name": "DATABRICKS UNITY",
- "description": null
}
}, - "spark_infra_version": {
- "spark_infra_version_id": 1,
- "spark_docker_image_name": "v3.2.2-28",
- "spark_version": "3.2.2",
- "hive_version": "2.3.9",
- "hadoop_version": "3.2.4",
- "scala_version": "2.12.15",
- "python_version": "3.9.5",
- "notebook_support": true,
- "has_cuda_support": true,
- "thrift_support": false,
- "yeedu_functions_support": true
}, - "engine_cluster_spark_config": {
- "max_parallel_spark_job_execution_per_instance": 5,
- "num_of_workers": null
}
}
}, - "tenant_id": "49a8817e-8b8e-4d76-a717-c33db6f7e018",
- "created_by": {
- "user_id": 3,
- "username": "rp0000-yeedu@yeedu.io"
}, - "modified_by": {
- "user_id": 3,
- "username": "rp0000-yeedu@yeedu.io"
}, - "last_update_date": "2024-06-21T08:48:37.459181+00:00",
- "from_date": "2024-06-21T08:48:37.459181+00:00",
- "to_date": "2024-06-21T08:48:45.564648+00:00"
}
]
}Search Spark job runs by job name.
Retrieves a list of Spark job runs based on a search by job name.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| cluster_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of cluster instance IDs to filter on. |
| job_name required | string Specifies the name of the Spark job to search for. |
| job_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of Spark job IDs to filter on. |
| run_status | Array of strings (JobStatus) Items Enum: "SUBMITTED" "RUNNING" "DONE" "ERROR" "TERMINATED" "STOPPING" "STOPPED" Specifies the status of Spark job run for filtering. |
| job_type | Array of strings Items Enum: "SPARK_JOB" "SPARK_SQL" "NOTEBOOK" "THRIFT_SQL" "YEEDU_FUNCTIONS" Specifies the job type of the Spark job run for filtering. |
| job_type_langs | Array of strings Items Enum: "RAW_SCALA" "Jar" "Python3" "Scala" "SQL" An optional set of language filters for Spark job runs. |
| created_by_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of created by user IDs to filter on. |
| modified_by_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of modified by user IDs to filter on. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "run_id": 298,
- "application_id": "local-1718959721329",
- "run_status": "DONE",
- "total_run_time_sec": 8.105467,
- "execution_time_sec": 6.300996,
- "runtime_arguments": {
- "arguments": "1000",
- "conf": [
- "spark.executor.memory=4g"
]
}, - "job_conf": {
- "job_id": 23,
- "job_name": "spark_example",
- "spark_job_type": {
- "job_type": "SPARK_JOB",
- "language": "Jar"
}, - "cluster_info": {
- "cluster_id": 13,
- "name": "gcp_test",
- "cluster_status": "RUNNING",
- "cluster_type": "YEEDU",
- "instance_size": 1,
- "min_instances": 1,
- "max_instances": 1,
- "cloud_env": {
- "cloud_env_id": 11,
- "name": "test",
- "cloud_provider": {
- "cloud_provider_id": 0,
- "name": "GCP"
}
}, - "cluster_conf": {
- "cluster_conf_id": 10,
- "cluster_conf_name": "n1-standard-4",
- "machine_type_category": "general_purpose",
- "machine_type": {
- "machine_type_id": 10,
- "name": "n1-standard-4",
- "vCPUs": 4,
- "memory": "15 GiB",
- "has_cuda": false,
- "gpu_model": null,
- "gpus": 0,
- "gpu_memory": null,
- "cpu_model": [
- "Intel Xeon Scalable (Skylake) 1st Generation",
- "Intel Xeon E5 v4 (Broadwell E5)",
- "Intel Xeon E5 v3 (Haswell)",
- "Intel Xeon E5 v2 (Ivy Bridge)",
- "Intel Xeon E5 (Sandy Bridge)"
], - "cpu_min_frequency_GHz": [
- "2.0",
- "2.2",
- "2.3",
- "2.5",
- "2.6"
], - "cpu_max_frequency_GHz": [
- "3.5",
- "3.7",
- "3.8",
- "3.5",
- "3.6"
], - "has_local_disk": false,
- "local_disk_size_GB": null,
- "local_num_of_disks": null,
- "local_disk_throughput_MB": null,
- "machine_price_ycu": 2.5
}, - "machine_volume_conf": {
- "volume_conf_id": 2,
- "name": "volume_gcp_2",
- "size": 375,
- "machine_volume_num": 2,
- "machine_volume_strip_num": 2
}
}, - "metastore_catalog": {
- "metastore_catalog_id": 4,
- "metastore_catalog_name": "aws_unity_catalog",
- "description": null,
- "metastore_catalog_type": {
- "metastore_catalog_type_id": 2,
- "name": "DATABRICKS UNITY",
- "description": null
}
}, - "spark_infra_version": {
- "spark_infra_version_id": 1,
- "spark_docker_image_name": "v3.2.2-28",
- "spark_version": "3.2.2",
- "hive_version": "2.3.9",
- "hadoop_version": "3.2.4",
- "scala_version": "2.12.15",
- "python_version": "3.9.5",
- "notebook_support": true,
- "has_cuda_support": true,
- "thrift_support": false,
- "yeedu_functions_support": true
}, - "engine_cluster_spark_config": {
- "max_parallel_spark_job_execution_per_instance": 5,
- "num_of_workers": null
}
}
}, - "tenant_id": "49a8817e-8b8e-4d76-a717-c33db6f7e018",
- "created_by": {
- "user_id": 3,
- "username": "rp0000-yeedu@yeedu.io"
}, - "modified_by": {
- "user_id": 3,
- "username": "rp0000-yeedu@yeedu.io"
}, - "last_update_date": "2024-06-21T08:48:37.459181+00:00",
- "from_date": "2024-06-21T08:48:37.459181+00:00",
- "to_date": "2024-06-21T08:48:45.564648+00:00"
}
]
}Create a new Spark job run.
A Spark job run for a specific configuration can only be created if the attached cluster instance is not in the ERROR or DESTROYING state and is not disabled.
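A sketch of submitting a run for an existing job configuration, using the documented body fields (job_id or job_name, plus optional arguments and conf). The path, base URL, and auth header are assumptions.

```python
import requests

BASE_URL = "https://yeedu.example.com/api/v1"       # hypothetical base URL
HEADERS = {"Authorization": "Bearer <your-token>"}  # assumed auth scheme

workspace_id = 1
payload = {
    "job_id": 1,
    "arguments": "1000",
    "conf": ["spark.executor.memory=4g"],
}

resp = requests.post(f"{BASE_URL}/workspace/{workspace_id}/spark/run",  # assumed path
                     json=payload, headers=HEADERS)
resp.raise_for_status()
run = resp.json()
print("submitted run", run["run_id"], "on cluster", run["cluster_id"])
```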
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
Request Body schema: application/json (required)
The Spark job to be used.
| job_id | integer <int64> |
| job_name | string non-empty |
| arguments | string or null non-empty |
| conf | Array of strings unique [ items non-empty ] |
Responses
Request samples
- Payload
{- "job_id": 1,
- "arguments": "1000",
- "conf": [
- "spark.executor.memory=4g"
]
}Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "run_id": "297",
- "job_id": "23",
- "cluster_id": "13",
- "tenant_id": "49a8817e-8b8e-4d76-a717-c33db6f7e018",
- "created_by_user_id": "3",
- "modified_by_user_id": "3",
- "last_update_date": "2024-06-21T08:47:55.301Z",
- "from_date": "2024-06-21T08:47:55.301Z",
- "to_date": null
}Get details of a specific Spark job run.
Retrieve Spark job run instance details by ID.
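A sketch that polls a run until it reaches a terminal state, using the run_status values documented for job runs; the route shape is an assumption built from the path parameters below.

```python
import time
import requests

BASE_URL = "https://yeedu.example.com/api/v1"       # hypothetical base URL
HEADERS = {"Authorization": "Bearer <your-token>"}  # assumed auth scheme

workspace_id, run_id = 1, 298
TERMINAL = {"DONE", "ERROR", "TERMINATED", "STOPPED"}

while True:
    resp = requests.get(f"{BASE_URL}/workspace/{workspace_id}/spark/run/{run_id}",  # assumed path
                        headers=HEADERS)
    resp.raise_for_status()
    status = resp.json()["run_status"]
    print("run", run_id, "is", status)
    if status in TERMINAL:
        break
    time.sleep(10)  # poll every 10 seconds
```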
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
| run_id required | integer <int64> Specifies the ID of the Spark job run for filtering. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "run_id": 298,
- "application_id": "local-1718959721329",
- "run_status": "DONE",
- "execution_time_sec": 6.300996,
- "total_run_time_sec": 8.105467,
- "job_conf": {
- "job_id": 23,
- "job_name": "spark_example",
- "spark_job_type": {
- "job_type": "SPARK_JOB",
- "language": "Jar"
}, - "cluster_info": {
- "cluster_id": 13,
- "name": "gcp_test",
- "cluster_status": "RUNNING",
- "cluster_type": "YEEDU",
- "instance_size": 1,
- "min_instances": 1,
- "max_instances": 1,
- "cloud_env": {
- "cloud_env_id": 11,
- "name": "test",
- "cloud_provider": {
- "cloud_provider_id": 0,
- "name": "GCP"
}
}, - "cluster_conf": {
- "cluster_conf_id": 10,
- "cluster_conf_name": "n1-standard-4",
- "machine_type_category": "general_purpose",
- "machine_type": {
- "machine_type_id": 10,
- "name": "n1-standard-4",
- "vCPUs": 4,
- "memory": "15 GiB",
- "has_cuda": false,
- "gpu_model": null,
- "gpus": 0,
- "gpu_memory": null,
- "cpu_model": [
- "Intel Xeon Scalable (Skylake) 1st Generation",
- "Intel Xeon E5 v4 (Broadwell E5)",
- "Intel Xeon E5 v3 (Haswell)",
- "Intel Xeon E5 v2 (Ivy Bridge)",
- "Intel Xeon E5 (Sandy Bridge)"
], - "cpu_min_frequency_GHz": [
- "2.0",
- "2.2",
- "2.3",
- "2.5",
- "2.6"
], - "cpu_max_frequency_GHz": [
- "3.5",
- "3.7",
- "3.8",
- "3.5",
- "3.6"
], - "has_local_disk": false,
- "local_disk_size_GB": null,
- "local_num_of_disks": null,
- "local_disk_throughput_MB": null,
- "machine_price_ycu": 2.5
}, - "machine_volume_conf": {
- "volume_conf_id": 2,
- "name": "volume_gcp_2",
- "size": 375,
- "machine_volume_num": 2,
- "machine_volume_strip_num": 2
}
}, - "metastore_catalog": {
- "metastore_catalog_id": 4,
- "metastore_catalog_name": "aws_unity_catalog",
- "description": null,
- "metastore_catalog_type": {
- "metastore_catalog_type_id": 2,
- "name": "DATABRICKS UNITY",
- "description": null
}
}, - "spark_infra_version": {
- "spark_infra_version_id": 1,
- "spark_docker_image_name": "v3.2.2-28",
- "spark_version": "3.2.2",
- "hive_version": "2.3.9",
- "hadoop_version": "3.2.4",
- "scala_version": "2.12.15",
- "python_version": "3.9.5",
- "notebook_support": true,
- "has_cuda_support": true,
- "thrift_support": false,
- "yeedu_functions_support": true
}, - "engine_cluster_spark_config": {
- "max_parallel_spark_job_execution_per_instance": 5,
- "num_of_workers": null
}
}
}, - "workflow_job_instance_details": {
- "workflow_job_instance_status": {
- "workflow_job_instance_id": 354,
- "workflow_job_id": 354,
- "status": "DONE",
- "from_date": "2024-06-21T08:48:37.459181+00:00",
- "to_date": "2024-06-21T08:48:45.564648+00:00"
}, - "workflow_execution_process": {
- "machine_pid_number": "1580",
- "machine_hostname": "yeedu13-c9b7944b-b5c1-33ad-6055-78cc47b1769a",
- "machine_id": "c9b7944b-b5c1-33ad-6055-78cc47b1769a",
- "machine_pid_user": "root",
- "machine_node_number": "0"
}
}, - "tenant_id": "49a8817e-8b8e-4d76-a717-c33db6f7e018",
- "created_by": {
- "user_id": 3,
- "username": "rp0000-yeedu@yeedu.io"
}, - "modified_by": {
- "user_id": 3,
- "username": "rp0000-yeedu@yeedu.io"
}, - "last_update_date": "2024-06-21T08:48:37.459181+00:00",
- "from_date": "2024-06-21T08:48:37.459181+00:00",
- "to_date": "2024-06-21T08:48:45.564648+00:00"
}Get Spark job run status details.
Retrieve Spark job run details in the sequential order of state transitions during execution.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
| run_id required | integer <int64> Specifies the ID of the Spark job run for filtering. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "run_status": "SUBMITTED",
- "created_by": "rp0000-yeedu@yeedu.io",
- "start_time": "2024-06-21T08:48:37.459181+00:00",
- "end_time": "2024-06-21T08:48:38.003953+00:00"
}, - {
- "run_status": "RUNNING",
- "created_by": "rp0000-yeedu@yeedu.io",
- "start_time": "2024-06-21T08:48:38.003953+00:00",
- "end_time": "2024-06-21T08:48:44.304949+00:00"
}, - {
- "run_status": "STOPPING",
- "created_by": "rp0000-yeedu@yeedu.io",
- "start_time": "2024-06-21T08:48:44.304949+00:00",
- "end_time": "2024-06-21T08:48:45.564648+00:00"
}, - {
- "run_status": "STOPPED",
- "created_by": "rp0000-yeedu@yeedu.io",
- "start_time": "2024-06-21T08:48:45.564648+00:00",
- "end_time": "infinity"
}
]Stop a Spark job run.
A Spark job run can only be stopped if its status is SUBMITTED or RUNNING.
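A sketch of stopping an in-flight run; both the route and the HTTP method (POST here) are assumptions, since neither is shown in this extract.

```python
import requests

BASE_URL = "https://yeedu.example.com/api/v1"       # hypothetical base URL
HEADERS = {"Authorization": "Bearer <your-token>"}  # assumed auth scheme

workspace_id, run_id = 1, 298

# Method and path are assumptions; only SUBMITTED or RUNNING runs can be stopped.
resp = requests.post(f"{BASE_URL}/workspace/{workspace_id}/spark/run/{run_id}/stop",
                     headers=HEADERS)
resp.raise_for_status()
print(resp.json())  # e.g. the SparkKill object shown in the response sample
```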
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
| run_id required | integer <int64> Specifies the ID of the Spark job run to stop. |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "SparkKill": {
- "workflow_job_id": 354,
- "workflow_job_instance_id": 354,
- "spark_job_instance_id": 298,
- "spark_job_id": 23,
- "compute_engine_id": 13
}
}Create a Thrift server proxy for a Spark job run.
Creates a proxy that establishes a connection to the Thrift server for a specific job run, provided the job run is in the RUNNING state.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
| run_id required | integer <int64> Specifies the ID of the Spark job run to create proxy for. |
Responses
Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "message": "Created Proxy for Spark job run Id: 108."
}Get workflow job execution details of a Spark job run.
Retrieve workflow job execution details for a specific Spark application ID.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
| application_id required | string Specifies the application ID of the Spark job run for filtering. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "run_id": 298,
- "application_id": "local-1718959721329",
- "run_status": "DONE",
- "compute_engine": 13,
- "cluster_id": "13",
- "workflow_job_instance_details": {
- "workflow_job_instance_status": {
- "workflow_job_instance_id": 354,
- "workflow_job_id": 354,
- "status": "DONE",
- "previous_status": [
- "NONE",
- "INIT",
- "SENT",
- "RECEIVED",
- "EXECUTING"
], - "from_date": "2024-06-21T08:48:45.564648+00:00",
- "to_date": "infinity"
}, - "workflow_execution_process": {
- "machine_pid_number": "1580",
- "machine_hostname": "yeedu13-c9b7944b-b5c1-33ad-6055-78cc47b1769a",
- "machine_id": "c9b7944b-b5c1-33ad-6055-78cc47b1769a",
- "machine_pid_user": "root",
- "machine_node_number": "0"
}
}, - "tenant_id": "49a8817e-8b8e-4d76-a717-c33db6f7e018",
- "created_by": {
- "user_id": 3,
- "username": "rp0000-yeedu@yeedu.io"
}, - "modified_by": {
- "user_id": 3,
- "username": "rp0000-yeedu@yeedu.io"
}, - "last_update_date": "2024-06-21T08:48:37.459181+00:00",
- "from_date": "2024-06-21T08:48:37.459181+00:00",
- "to_date": "2024-06-21T08:48:45.564648+00:00"
}Download Spark job run logs.
Downloads logs of a Spark job run by specifying the run ID and log type.
- Provide last_n_lines to fetch only the last N lines of the log.
- Provide file_size_bytes to download only the first N bytes of the log.
If neither option is specified, the complete log file will be downloaded.
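A sketch of downloading stderr for a run and streaming it to a local file, sending last_n_lines to limit the preview. The route shape is an assumption built from the documented path parameters.

```python
import requests

BASE_URL = "https://yeedu.example.com/api/v1"       # hypothetical base URL
HEADERS = {"Authorization": "Bearer <your-token>"}  # assumed auth scheme

workspace_id, run_id, log_type = 1, 298, "stderr"

resp = requests.get(
    f"{BASE_URL}/workspace/{workspace_id}/spark/run/{run_id}/log/{log_type}",  # assumed path
    params={"last_n_lines": 200},  # preview only the last 200 lines
    headers=HEADERS,
    stream=True,
)
resp.raise_for_status()
with open(f"run-{run_id}-{log_type}.log", "wb") as fh:
    for chunk in resp.iter_content(chunk_size=8192):
        fh.write(chunk)
```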
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
| run_id required | integer <int64> Specifies the ID of the Spark job run for filtering. |
| log_type required | string Enum: "stdout" "stderr" The type of log file to filter. |
query Parameters
| last_n_lines | integer <int32> [ 1 .. 1000 ] Number of lines to retrieve from the end of the log file (sample preview). |
| file_size_bytes | integer <int64> >= 1 Number of bytes to preview from the beginning of the log file (sample preview). |
Responses
Response samples
- 400
- 401
- 404
- 500
{- "error_code": "string",
- "error_message": "string"
}Get workflow errors of a Spark job run.
Retrieve workflow errors for a specific Spark job run.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
| run_id required | integer <int64> Specifies the ID of the Spark job run for filtering. |
query Parameters
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "error": "OnError(Process finish with exit code 137,None,None,None,Some(akka.stream.alpakka.amqp.impl.AmqpSourceStage$$anon$1$$anon$2$$anon$3@47b8c55f))"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 1,
- "total_pages": 1,
- "limit": 10
}
}Get all filter data for Spark job runs within a workspace.
Retrieves Spark job run data, including:
- Engine cluster instances used by Spark jobs in the workspace.
- Spark jobs executed within the workspace.
- Users who created the Spark job runs.
- Users who modified the Spark job runs.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| filter_type required | string Enum: "cluster" "job_conf" "created_by_user" "modified_by_user" Specifies the filter type for the data to retrieve. Choose one of the following:
|
| job_id | integer <int64> Specifies the ID of the Spark job for filtering. |
| name | string Specifies the name to filter by, applicable to the chosen filter type. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "cluster_id": 1,
- "cluster_name": "azure_instance"
}
]
}Get all notebooks.
Retrieves a list of notebooks.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| enable | boolean Enum: true false Specifies which notebooks to list.
Note: If unspecified, all notebooks (both active and disabled) will be listed. |
| has_run | boolean Enum: true false Specifies which Notebook configurations to list based on whether they have runs.
Note: If unspecified, all Notebook configurations (both with and without runs) will be listed. |
| cluster_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of cluster instance IDs to filter on. |
| job_type_langs | Array of strings Items Enum: "Python3" "Scala" "SQL" Specifies the languages of the notebooks for filtering. |
| last_run_status | Array of strings (JobStatus) Items Enum: "SUBMITTED" "RUNNING" "DONE" "ERROR" "TERMINATED" "STOPPING" "STOPPED" Specifies the last run status of the notebook for filtering. |
| created_by_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of created by user IDs to filter on. |
| modified_by_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of modified by user IDs to filter on. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "notebook_id": 27,
- "notebook_name": "test_notebook",
- "spark_job_type": {
- "job_type": "NOTEBOOK",
- "language": "Python3"
}, - "notebook_file_id": "1",
- "cluster_info": {
- "cluster_id": 13,
- "name": "gcp_test",
- "cluster_status": "RUNNING",
- "cluster_type": "YEEDU",
- "instance_size": 1,
- "min_instances": 1,
- "max_instances": 1,
- "cloud_env": {
- "cloud_env_id": 11,
- "name": "test",
- "cloud_provider": {
- "cloud_provider_id": 0,
- "name": "GCP"
}
}, - "cluster_conf": {
- "cluster_conf_id": 10,
- "cluster_conf_name": "n1-standard-4",
- "machine_type_category": "general_purpose",
- "machine_type": {
- "machine_type_id": 10,
- "name": "n1-standard-4",
- "vCPUs": 4,
- "memory": "15 GiB",
- "has_cuda": false,
- "gpu_model": null,
- "gpus": 0,
- "gpu_memory": null,
- "cpu_model": [
- "Intel Xeon Scalable (Skylake) 1st Generation",
- "Intel Xeon E5 v4 (Broadwell E5)",
- "Intel Xeon E5 v3 (Haswell)",
- "Intel Xeon E5 v2 (Ivy Bridge)",
- "Intel Xeon E5 (Sandy Bridge)"
], - "cpu_min_frequency_GHz": [
- "2.0",
- "2.2",
- "2.3",
- "2.5",
- "2.6"
], - "cpu_max_frequency_GHz": [
- "3.5",
- "3.7",
- "3.8",
- "3.5",
- "3.6"
], - "has_local_disk": false,
- "local_disk_size_GB": null,
- "local_num_of_disks": null,
- "local_disk_throughput_MB": null,
- "machine_price_ycu": 2.5
}, - "machine_volume_conf": {
- "volume_conf_id": 2,
- "name": "volume_gcp_2",
- "size": 375,
- "machine_volume_num": 2,
- "machine_volume_strip_num": 2
}
}, - "metastore_catalog": {
- "metastore_catalog_id": 4,
- "metastore_catalog_name": "aws_unity_catalog",
- "description": null,
- "metastore_catalog_type": {
- "metastore_catalog_type_id": 2,
- "name": "DATABRICKS UNITY",
- "description": null
}
}, - "spark_infra_version": {
- "spark_infra_version_id": 1,
- "spark_docker_image_name": "v3.2.2-28",
- "spark_version": "3.2.2",
- "hive_version": "2.3.9",
- "hadoop_version": "3.2.4",
- "scala_version": "2.12.15",
- "python_version": "3.9.5",
- "notebook_support": true,
- "has_cuda_support": true,
- "thrift_support": false,
- "yeedu_functions_support": true
}, - "engine_cluster_spark_config": {
- "max_parallel_spark_job_execution_per_instance": 5,
- "num_of_workers": null
}
}, - "last_notebook_run": {
- "run_id": 299,
- "run_status": "RUNNING"
}, - "created_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "last_update_date": "2024-06-21T09:14:40.88+00:00",
- "from_date": "2024-06-21T09:09:30.277914+00:00",
- "to_date": "infinity"
}
]
}Search notebooks by notebook name.
Retrieves a list of notebooks based on a search by notebook name.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| notebook_name required | string Specifies the name of the notebook to search for. |
| enable | boolean Enum: true false Specifies which notebooks to search.
Note: If unspecified, all notebooks (both active and disabled) will be searched. |
| has_run | boolean Enum: true false Specifies which Notebook configurations to list based on whether they have runs.
Note: If unspecified, all Notebook configurations (both with and without runs) will be listed. |
| cluster_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of cluster instance IDs to filter on. |
| job_type_langs | Array of strings Items Enum: "Python3" "Scala" "SQL" Specifies the languages of the notebooks for filtering. |
| last_run_status | Array of strings (JobStatus) Items Enum: "SUBMITTED" "RUNNING" "DONE" "ERROR" "TERMINATED" "STOPPING" "STOPPED" Specifies the last run status of the notebook for filtering. |
| created_by_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of created by user IDs to filter on. |
| modified_by_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of modified by user IDs to filter on. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "notebook_id": 27,
- "notebook_name": "test_notebook",
- "spark_job_type": {
- "job_type": "NOTEBOOK",
- "language": "Python3"
}, - "notebook_file_id": "1",
- "cluster_info": {
- "cluster_id": 13,
- "name": "gcp_test",
- "cluster_status": "RUNNING",
- "cluster_type": "YEEDU",
- "instance_size": 1,
- "min_instances": 1,
- "max_instances": 1,
- "cloud_env": {
- "cloud_env_id": 11,
- "name": "test",
- "cloud_provider": {
- "cloud_provider_id": 0,
- "name": "GCP"
}
}, - "cluster_conf": {
- "cluster_conf_id": 10,
- "cluster_conf_name": "n1-standard-4",
- "machine_type_category": "general_purpose",
- "machine_type": {
- "machine_type_id": 10,
- "name": "n1-standard-4",
- "vCPUs": 4,
- "memory": "15 GiB",
- "has_cuda": false,
- "gpu_model": null,
- "gpus": 0,
- "gpu_memory": null,
- "cpu_model": [
- "Intel Xeon Scalable (Skylake) 1st Generation",
- "Intel Xeon E5 v4 (Broadwell E5)",
- "Intel Xeon E5 v3 (Haswell)",
- "Intel Xeon E5 v2 (Ivy Bridge)",
- "Intel Xeon E5 (Sandy Bridge)"
], - "cpu_min_frequency_GHz": [
- "2.0",
- "2.2",
- "2.3",
- "2.5",
- "2.6"
], - "cpu_max_frequency_GHz": [
- "3.5",
- "3.7",
- "3.8",
- "3.5",
- "3.6"
], - "has_local_disk": false,
- "local_disk_size_GB": null,
- "local_num_of_disks": null,
- "local_disk_throughput_MB": null,
- "machine_price_ycu": 2.5
}, - "machine_volume_conf": {
- "volume_conf_id": 2,
- "name": "volume_gcp_2",
- "size": 375,
- "machine_volume_num": 2,
- "machine_volume_strip_num": 2
}
}, - "metastore_catalog": {
- "metastore_catalog_id": 4,
- "metastore_catalog_name": "aws_unity_catalog",
- "description": null,
- "metastore_catalog_type": {
- "metastore_catalog_type_id": 2,
- "name": "DATABRICKS UNITY",
- "description": null
}
}, - "spark_infra_version": {
- "spark_infra_version_id": 1,
- "spark_docker_image_name": "v3.2.2-28",
- "spark_version": "3.2.2",
- "hive_version": "2.3.9",
- "hadoop_version": "3.2.4",
- "scala_version": "2.12.15",
- "python_version": "3.9.5",
- "notebook_support": true,
- "has_cuda_support": true,
- "thrift_support": false,
- "yeedu_functions_support": true
}, - "engine_cluster_spark_config": {
- "max_parallel_spark_job_execution_per_instance": 5,
- "num_of_workers": null
}
}, - "last_notebook_run": {
- "run_id": 299,
- "run_status": "RUNNING"
}, - "created_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "last_update_date": "2024-06-21T09:14:40.88+00:00",
- "from_date": "2024-06-21T09:09:30.277914+00:00",
- "to_date": "infinity"
}
]
}Create a new notebook.
Creates a notebook with the provided configurations.
max_concurrency: Specifies the maximum number of concurrent notebook runs allowed for the configuration.
- By default, max_concurrency is set to 1 and is non-editable for a notebook.
should_append_params: Determines whether the job-level Spark configuration should append to or override the cluster-level Spark configuration.
- If set to true, the job's Spark configuration is appended to the cluster's Spark configuration. This applies to fields such as --conf, --jars, and --packages. For example, if the cluster is configured with --packages=org.postgresql:postgresql:42.2.20 and the job specifies --packages=org.duckdb:duckdb_jdbc:0.9.1, the resulting configuration will be --packages=org.duckdb:duckdb_jdbc:0.9.1,org.postgresql:postgresql:42.2.20.
- If set to false, the job's Spark configuration overrides the cluster's Spark configuration.
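A sketch of creating a notebook from the documented request sample; as with the job examples, the base URL, path, and auth header are assumptions.

```python
import requests

BASE_URL = "https://yeedu.example.com/api/v1"       # hypothetical base URL
HEADERS = {"Authorization": "Bearer <your-token>"}  # assumed auth scheme

workspace_id = 5
notebook = {
    "notebook_name": "test_notebook",
    "cluster_name": "yeedu_instance",
    "notebook_type": "python3",
    "notebook_path": "/foo/bar",
    "conf": ["spark.ui.enabled=false"],
}

resp = requests.post(f"{BASE_URL}/workspace/{workspace_id}/notebook",  # assumed path
                     json=notebook, headers=HEADERS)
resp.raise_for_status()
print(resp.json()["notebook_id"])  # ID of the newly created notebook
```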
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
Request Body schema: application/json (required)
The notebook to be added.
| cluster_id | integer <int64> |
| cluster_name | string non-empty |
| notebook_name required | string non-empty |
| notebook_type required | string Enum: "python3" "scala" "sql" |
| notebook_path | string non-empty |
| conf | Array of strings unique [ items non-empty ] |
| packages | Array of strings unique [ items non-empty ] |
| jars | Array of strings unique [ items non-empty ] |
| files | Array of strings unique [ items non-empty ] |
| driver_memory | string or null ^(?=.*[1-9])\d+(?![\d.])(?=\D) The memory value cannot be negative or a decimal. |
| executor_memory | string or null ^(?=.*[1-9])\d+(?![\d.])(?=\D) The memory value cannot be negative or a decimal. |
| driver_cores | integer or null <int32> >= 1 |
| total_executor_cores | integer or null <int64> >= 1 |
| executor_cores | integer or null <int32> >= 1 |
| num_executors | integer or null <int32> >= 1 |
| should_append_params | boolean Default: false If set to |
Responses
Request samples
- Payload
{- "notebook_name": "test_notebook",
- "cluster_name": "yeedu_instance",
- "notebook_type": "python3",
- "notebook_path": "/foo/bar",
- "conf": [
- "spark.sql.files.maxRecordsPerFile=1000000",
- "spark.ui.enabled=false",
- "spark.driver.yeedu_privateKey=file:///yeedu/object-storage-manager/privateKey"
], - "packages": [
- "com.github.music-of-the-ainur:almaren-framework_2.11:0.9.3-2.4",
- "org.postgresql:postgresql:42.2.8",
- "org.apache.hadoop:hadoop-aws:2.10.1",
- "org.apache.spark:spark-hive_2.11:2.4.8"
], - "jars": [
- "file:///yeedu/object-storage-manager/gcs-connector-hadoop2-latest.jar",
- "file:///yeedu/object-storage-manager/ojdbc6-11.2.0.4.jar"
]
}Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "notebook_id": "27",
- "notebook_name": "test_notebook",
- "cluster_id": "13",
- "workspace_id": "5",
- "spark_job_type_lang_id": 3,
- "max_concurrency": "1",
- "notebook_file_id": "1",
- "conf": null,
- "packages": null,
- "jars": null,
- "files": null,
- "driver_memory": null,
- "executor_memory": null,
- "driver_cores": null,
- "total_executor_cores": null,
- "executor_cores": null,
- "num_executors": null,
- "should_append_params": false,
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-06-21T09:09:30.277Z",
- "from_date": "2024-06-21T09:09:30.277Z",
- "to_date": null
}Get details of a specific notebook.
Retrieve notebook details filtered by ID or name.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| notebook_id | integer <int64> Specifies the ID of the notebook for filtering. |
| notebook_name | string Specifies the name of the notebook for filtering. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "notebook_id": 27,
- "notebook_name": "test_notebook",
- "cluster_info": {
- "cluster_id": 13,
- "name": "gcp_test",
- "cluster_status": "RUNNING",
- "cluster_type": "YEEDU",
- "instance_size": 1,
- "min_instances": 1,
- "max_instances": 1,
- "cluster_conf": {
- "cluster_conf_id": 10,
- "cluster_conf_name": "n1-standard-4",
- "machine_type_category": "general_purpose",
- "machine_type": {
- "machine_type_id": 10,
- "name": "n1-standard-4",
- "vCPUs": 4,
- "memory": "15 GiB",
- "has_cuda": false,
- "gpu_model": null,
- "gpus": 0,
- "gpu_memory": null,
- "cpu_model": [
- "Intel Xeon Scalable (Skylake) 1st Generation",
- "Intel Xeon E5 v4 (Broadwell E5)",
- "Intel Xeon E5 v3 (Haswell)",
- "Intel Xeon E5 v2 (Ivy Bridge)",
- "Intel Xeon E5 (Sandy Bridge)"
], - "cpu_min_frequency_GHz": [
- "2.0",
- "2.2",
- "2.3",
- "2.5",
- "2.6"
], - "cpu_max_frequency_GHz": [
- "3.5",
- "3.7",
- "3.8",
- "3.5",
- "3.6"
], - "has_local_disk": false,
- "local_disk_size_GB": null,
- "local_num_of_disks": null,
- "local_disk_throughput_MB": null,
- "machine_price_ycu": 2.5
}, - "machine_volume_conf": {
- "volume_conf_id": 2,
- "name": "volume_gcp_2",
- "size": 375,
- "machine_volume_num": 2,
- "machine_volume_strip_num": 2
}
}, - "cloud_env": {
- "cloud_env_id": 11,
- "name": "test",
- "cloud_provider": {
- "cloud_provider_id": 0,
- "name": "GCP"
}
}, - "metastore_catalog": {
- "metastore_catalog_id": 4,
- "metastore_catalog_name": "aws_unity_catalog",
- "description": null,
- "metastore_catalog_type": {
- "metastore_catalog_type_id": 2,
- "name": "DATABRICKS UNITY",
- "description": null
}
}, - "spark_infra_version": {
- "spark_infra_version_id": 1,
- "spark_docker_image_name": "v3.2.2-28",
- "spark_version": "3.2.2",
- "hive_version": "2.3.9",
- "hadoop_version": "3.2.4",
- "scala_version": "2.12.15",
- "python_version": "3.9.5",
- "notebook_support": true,
- "has_cuda_support": true,
- "thrift_support": false,
- "yeedu_functions_support": true
}, - "engine_cluster_spark_config": {
- "max_parallel_spark_job_execution_per_instance": 5,
- "num_of_workers": null
}
}, - "spark_job_type": {
- "job_type": "NOTEBOOK",
- "language": "Python3"
}, - "max_concurrency": 1,
- "notebook_file_id": "1",
- "files": null,
- "conf": null,
- "packages": null,
- "jars": null,
- "driver_memory": null,
- "driver_java_options": null,
- "driver_library_path": null,
- "driver_class_path": null,
- "executor_memory": null,
- "driver_cores": null,
- "total_executor_cores": null,
- "executor_cores": null,
- "num_executors": null,
- "should_append_params": false,
- "tenant_id": "49a8817e-8b8e-4d76-a717-c33db6f7e018",
- "created_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "last_update_date": "2024-06-21T09:09:30.277914+00:00",
- "from_date": "2024-06-21T09:09:30.277914+00:00",
- "to_date": "infinity"
}Update details of a specific notebook.
Modify an existing notebook identified by ID or name.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| notebook_id | integer <int64> Specifies the ID of the notebook for modification. |
| notebook_name | string Specifies the name of the notebook for modification. |
Request Body schema: application/jsonrequired
Notebook details to be modified.
| notebook_name | string non-empty |
| cluster_name | string or null non-empty |
| cluster_id | integer or null <int64> |
| conf | Array of strings or null unique |
| packages | Array of strings or null unique |
| jars | Array of strings or null unique |
| files | Array of strings or null unique |
| driver_memory | string or null^(?=.*[1-9])\d+(?![\d.])(?=\D) The memory cannot be negative or a decimal value. |
| executor_memory | string or null^(?=.*[1-9])\d+(?![\d.])(?=\D) The memory cannot be negative or a decimal value. |
| driver_cores | integer or null <int32> >= 1 |
| total_executor_cores | integer or null <int64> >= 1 |
| executor_cores | integer or null <int32> >= 1 |
| num_executors | integer or null <int32> >= 1 |
| should_append_params | boolean If set to |
Responses
Request samples
- Payload
{- "notebook_name": "test_notebook",
- "cluster_id": "13",
- "conf": null,
- "packages": null,
- "jars": null,
- "files": null,
- "driver_memory": null,
- "executor_memory": null,
- "driver_cores": null,
- "total_executor_cores": null,
- "executor_cores": null,
- "num_executors": null,
- "should_append_params": false
}Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "notebook_id": "27",
- "notebook_name": "test_notebook",
- "cluster_id": "13",
- "workspace_id": "5",
- "spark_job_type_lang_id": 3,
- "max_concurrency": "1",
- "notebook_file_id": "1",
- "conf": null,
- "packages": null,
- "jars": null,
- "files": null,
- "driver_memory": null,
- "executor_memory": null,
- "driver_cores": null,
- "total_executor_cores": null,
- "executor_cores": null,
- "num_executors": null,
- "should_append_params": false,
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-06-21T09:09:30.277Z",
- "from_date": "2024-06-21T09:09:30.277Z",
- "to_date": null
}Enable a specific notebook.
Enable an existing Notebook.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| notebook_id | integer <int64> Specifies the ID of the notebook to enable. |
| notebook_name | string Specifies the name of the notebook to enable. |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "notebook_id": "27",
- "notebook_name": "test_notebook",
- "cluster_id": "13",
- "workspace_id": "5",
- "spark_job_type_lang_id": 3,
- "max_concurrency": "1",
- "notebook_file_id": "1",
- "conf": null,
- "packages": null,
- "jars": null,
- "files": null,
- "driver_memory": null,
- "executor_memory": null,
- "driver_cores": null,
- "total_executor_cores": null,
- "executor_cores": null,
- "num_executors": null,
- "should_append_params": false,
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-06-21T09:09:30.277Z",
- "from_date": "2024-06-21T09:09:30.277Z",
- "to_date": null
}Disable a specific notebook.
Disable an existing Notebook.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| notebook_id | integer <int64> Specifies the ID of the notebook to disable. |
| notebook_name | string Specifies the name of the notebook to disable. |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "notebook_id": "26",
- "notebook_name": "test_notebook",
- "cluster_id": "13",
- "workspace_id": "5",
- "spark_job_type_lang_id": 3,
- "max_concurrency": "1",
- "notebook_file_id": "1",
- "conf": null,
- "packages": null,
- "jars": null,
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-06-21T09:25:58.823Z",
- "from_date": "2024-06-21T09:09:11.436Z",
- "to_date": "2024-06-21T09:25:58.823Z"
}Update notebook cell data for a specific notebook.
Updates the cell data within a notebook identified by its ID.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
| notebook_id required | integer <int64> Specifies the ID of the notebook for modification. |
query Parameters
| notebook_directory_path | string non-empty Specifies the path of the directory where the notebook should be saved |
Request Body schema: application/jsonrequired
Notebook cell data to be updated.
required | object |
| nbformat required | integer Value: 4 Notebook format version |
| nbformat_minor required | integer Enum: 0 1 2 3 4 5 6 7 Minor version of the notebook format |
required | Array of objects List of cells in the notebook |
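As a rough sketch of how the schema above might be sent, the snippet below builds a minimal nbformat v4 body and submits it; the base URL, bearer-token header, route, and HTTP method are placeholders rather than values taken from this specification.

```python
import requests

# Placeholders -- substitute your deployment URL, token, and IDs.
BASE_URL = "https://yeedu.example.com/api/v1"   # hypothetical base URL
HEADERS = {"Authorization": "Bearer <token>"}   # auth scheme assumed to be a bearer token
workspace_id, notebook_id = 5, 27

# Minimal nbformat v4 body covering the required fields: metadata,
# nbformat, nbformat_minor, and a list of cells.
cell_data = {
    "metadata": {"kernel_info": {"name": "python3"}},
    "nbformat": 4,
    "nbformat_minor": 0,
    "cells": [
        {
            "cell_type": "code",
            "execution_count": 1,
            "metadata": {"order": 0},
            "source": ["print('Hello, world!')"],
            "outputs": [],
        }
    ],
}

# Route and HTTP method are illustrative; check this specification for the real ones.
resp = requests.post(
    f"{BASE_URL}/workspace/{workspace_id}/notebook/{notebook_id}/cells",
    headers=HEADERS,
    json=cell_data,
    params={"notebook_directory_path": "/foo/bar"},   # optional query parameter
    timeout=30,
)
resp.raise_for_status()
```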
Responses
Request samples
- Payload
{- "metadata": {
- "signature": "hex-digest",
- "kernel_info": {
- "name": "python3"
}, - "language_info": {
- "name": "python",
- "version": "3.9",
- "codemirror_mode": "python"
}
}, - "nbformat": 4,
- "nbformat_minor": 0,
- "cells": [
- {
- "cell_type": "code",
- "cell_uuid": "1b22dfa2-a029-4c31-a06a-d81220303e10",
- "execution_count": 1,
- "metadata": {
- "order": 0,
- "collapsed": true,
- "scrolled": "auto",
- "isLastRunTime": "",
- "isCodeModified": false,
- "deletable": false,
- "editable": false,
- "format": "text/plain",
- "name": "example-cell",
- "tags": [
- "example",
- "test"
], - "jupyter": {
- "source_hidden": false,
- "outputs_hidden": false
}, - "execution": {
- "iopub.execute_input": "2025-03-04T10:00:00Z",
- "iopub.status.busy": "2025-03-04T10:00:01Z",
- "shell.execute_reply": "2025-03-04T10:00:02Z",
- "iopub.status.idle": "2025-03-04T10:00:03Z"
}
}, - "source": [
- "print('Hello, world!')"
], - "outputs": [
- {
- "output_type": "stream",
- "name": "stdout",
- "text": [
- "Hello, world!\n"
]
}
]
}
]
}Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "notebook_id": "27",
- "notebook_name": "test_notebook",
- "cluster_id": "13",
- "workspace_id": "5",
- "spark_job_type_lang_id": 3,
- "max_concurrency": "1",
- "notebook_file_id": "1",
- "conf": null,
- "packages": null,
- "jars": null,
- "files": null,
- "driver_memory": null,
- "executor_memory": null,
- "driver_cores": null,
- "total_executor_cores": null,
- "executor_cores": null,
- "num_executors": null,
- "should_append_params": false,
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-06-21T09:09:30.277Z",
- "from_date": "2024-06-21T09:09:30.277Z",
- "to_date": null
}Get all filter data for notebooks within a workspace.
Retrieves a list of:
- Engine cluster instances used by notebooks within a workspace.
- Users who created the notebooks.
- Users who modified the notebooks.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| enable | boolean Enum: true false Specifies which notebook configurations to list.
Note: If unspecified, all notebook configurations (both active and disabled) will be listed. |
| filter_type required | string Enum: "cluster" "created_by_user" "modified_by_user" Specifies the filter type for the data to retrieve. Choose one of the following:
|
| name | string Specifies the name to filter by, applicable to the chosen filter type. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "cluster_id": 1,
- "cluster_name": "azure_instance"
}
]
}Export Notebook details for a specific Notebook Id
Exports the details of a specific notebook, identified by its Notebook Id, and returns them as JSON.
- The user needs at least one permission in the workspace from which the notebook is being exported.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| notebook_id | integer <int64> Specifies the ID of the notebook for filtering. |
| notebook_name | string Specifies the name of the notebook for filtering. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "notebooks": [
- {
- "name": "test_notebook",
- "spark_job_type": {
- "job_type": "NOTEBOOK",
- "language": "Python3"
}, - "cluster_info": null,
- "notebook_file_id": "1",
- "conf": null,
- "packages": null,
- "jars": null,
- "files": null,
- "driver_memory": null,
- "executor_memory": null,
- "driver_cores": null,
- "total_executor_cores": null,
- "executor_cores": null,
- "num_executors": null,
- "should_append_params": false
}
]
}Clone a notebook.
Clones an existing notebook identified by its ID or name.
- If clone_file_path is not provided, the notebook will be cloned to its existing notebook_file_path with the specified new_notebook_name.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| notebook_id | integer <int64> Specifies the ID of the notebook to be cloned. |
| notebook_name | string Specifies the name of the notebook to be cloned. |
| new_notebook_name required | string Specifies the name of the new notebook to be cloned. |
| clone_file_path | string non-empty Specifies the path of the directory where the notebook should be cloned |
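A minimal sketch of a clone request follows, assuming a bearer-token header and an illustrative base URL, route, and HTTP method (none of which are taken from this specification); the query parameters match those documented above.

```python
import requests

# Placeholders -- substitute your deployment URL, token, and IDs.
BASE_URL = "https://yeedu.example.com/api/v1"   # hypothetical base URL
HEADERS = {"Authorization": "Bearer <token>"}   # auth scheme assumed to be a bearer token
workspace_id = 5

# Clone notebook 27 under a new name. clone_file_path is omitted, so the
# clone is created at the source notebook's existing file path.
params = {"notebook_id": 27, "new_notebook_name": "test_notebook_copy"}

# Route and HTTP method are illustrative; check this specification for the real ones.
resp = requests.post(
    f"{BASE_URL}/workspace/{workspace_id}/notebook/clone",
    headers=HEADERS,
    params=params,
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["notebook_id"])
```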
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "notebook_id": "27",
- "notebook_name": "test_notebook",
- "cluster_id": "13",
- "workspace_id": "5",
- "spark_job_type_lang_id": 3,
- "max_concurrency": "1",
- "notebook_file_id": "1",
- "conf": null,
- "packages": null,
- "jars": null,
- "files": null,
- "driver_memory": null,
- "executor_memory": null,
- "driver_cores": null,
- "total_executor_cores": null,
- "executor_cores": null,
- "num_executors": null,
- "should_append_params": false,
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-06-21T09:09:30.277Z",
- "from_date": "2024-06-21T09:09:30.277Z",
- "to_date": null
}Get all notebook runs.
Retrieves a list of notebook runs.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| cluster_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of cluster instance IDs to filter on. |
| notebook_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of Notebook IDs to filter on. |
| run_status | Array of strings (JobStatus) Items Enum: "SUBMITTED" "RUNNING" "DONE" "ERROR" "TERMINATED" "STOPPING" "STOPPED" Specifies the status of Notebook run for filtering. |
| job_type_langs | Array of strings Items Enum: "Python3" "Scala" "SQL" Specifies the languages of the notebooks for filtering. |
| created_by_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of created by user IDs to filter on. |
| modified_by_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of modified by user IDs to filter on. |
| isActive | boolean Enum: true false Specifies whether to list only active notebook runs that are in the RUNNING or SUBMITTED state. |
| limit | integer <int32> Default: 100 The number of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
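The sketch below shows one way to page through notebook runs while filtering on run_status. It assumes a bearer-token header, an illustrative base URL and route, and array query parameters serialized as repeated keys; only the parameter names come from the table above.

```python
import requests

# Placeholders -- substitute your deployment URL, token, and workspace ID.
BASE_URL = "https://yeedu.example.com/api/v1"   # hypothetical base URL
HEADERS = {"Authorization": "Bearer <token>"}   # auth scheme assumed to be a bearer token
workspace_id = 5

def list_runs(page_number: int = 1, limit: int = 100):
    """Fetch one page of notebook runs, filtered to active states."""
    params = {
        # Array serialization assumed to be repeated query parameters.
        "run_status": ["SUBMITTED", "RUNNING"],
        "limit": limit,
        "pageNumber": page_number,
    }
    # Illustrative route; use the actual path from this specification.
    resp = requests.get(
        f"{BASE_URL}/workspace/{workspace_id}/notebook/runs",
        headers=HEADERS,
        params=params,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["data"]

for run in list_runs():
    print(run["run_id"], run["run_status"])
```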
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "run_id": 299,
- "run_status": "STOPPED",
- "runtime_arguments": {
- "arguments": "1000",
- "conf": [
- "spark.executor.memory=4g"
]
}, - "notebook_conf": {
- "notebook_id": 27,
- "notebook_name": "Notebook_01",
- "notebook_job_type": {
- "job_type": "NOTEBOOK",
- "language": "Python3"
}, - "notebook_file_id": "1",
- "cluster_info": {
- "cluster_id": 13,
- "name": "gcp_test",
- "cluster_status": "RUNNING",
- "cluster_type": "YEEDU",
- "instance_size": 1,
- "min_instances": 1,
- "max_instances": 1,
- "cloud_env": {
- "cloud_env_id": 11,
- "name": "test",
- "cloud_provider": {
- "cloud_provider_id": 0,
- "name": "GCP"
}
}, - "cluster_conf": {
- "cluster_conf_id": 10,
- "cluster_conf_name": "n1-standard-4",
- "machine_type_category": "general_purpose",
- "machine_type": {
- "machine_type_id": 10,
- "name": "n1-standard-4",
- "vCPUs": 4,
- "memory": "15 GiB",
- "has_cuda": false,
- "gpu_model": null,
- "gpus": 0,
- "gpu_memory": null,
- "cpu_model": [
- "Intel Xeon Scalable (Skylake) 1st Generation",
- "Intel Xeon E5 v4 (Broadwell E5)",
- "Intel Xeon E5 v3 (Haswell)",
- "Intel Xeon E5 v2 (Ivy Bridge)",
- "Intel Xeon E5 (Sandy Bridge)"
], - "cpu_min_frequency_GHz": [
- "2.0",
- "2.2",
- "2.3",
- "2.5",
- "2.6"
], - "cpu_max_frequency_GHz": [
- "3.5",
- "3.7",
- "3.8",
- "3.5",
- "3.6"
], - "has_local_disk": false,
- "local_disk_size_GB": null,
- "local_num_of_disks": null,
- "local_disk_throughput_MB": null,
- "machine_price_ycu": 2.5
}, - "machine_volume_conf": {
- "volume_conf_id": 2,
- "name": "volume_gcp_2",
- "size": 375,
- "machine_volume_num": 2,
- "machine_volume_strip_num": 2
}
}, - "spark_infra_version": {
- "spark_infra_version_id": 1,
- "spark_docker_image_name": "v3.2.2-28",
- "spark_version": "3.2.2",
- "hive_version": "2.3.9",
- "hadoop_version": "3.2.4",
- "scala_version": "2.12.15",
- "python_version": "3.9.5",
- "notebook_support": true,
- "has_cuda_support": true,
- "thrift_support": false,
- "yeedu_functions_support": true
}, - "engine_cluster_spark_config": {
- "max_parallel_spark_job_execution_per_instance": 5,
- "num_of_workers": null
}, - "metastore_catalog_id": 1
}
}, - "tenant_id": "49a8817e-8b8e-4d76-a717-c33db6f7e018",
- "created_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "last_update_date": "2024-06-21T09:13:47.789222+00:00",
- "from_date": "2024-06-21T09:13:47.789222+00:00",
- "to_date": "2024-06-21T09:20:26.56676+00:00"
}
]
}Search notebook runs by notebook name.
Retrieves a list of notebook runs based on a search by notebook name.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| notebook_name required | string Specifies the notebook name to be used as a filter. |
| cluster_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of cluster instance IDs to filter on. |
| notebook_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of Notebook IDs to filter on. |
| run_status | Array of strings (JobStatus) Items Enum: "SUBMITTED" "RUNNING" "DONE" "ERROR" "TERMINATED" "STOPPING" "STOPPED" Specifies the status of Notebook run for filtering. |
| job_type_langs | Array of strings Items Enum: "Python3" "Scala" "SQL" Specifies the languages of the notebooks for filtering. |
| created_by_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of created by user IDs to filter on. |
| modified_by_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of modified by user IDs to filter on. |
| limit | integer <int32> Default: 100 The number of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "run_id": 299,
- "run_status": "STOPPED",
- "runtime_arguments": {
- "arguments": "1000",
- "conf": [
- "spark.executor.memory=4g"
]
}, - "notebook_conf": {
- "notebook_id": 27,
- "notebook_name": "Notebook_01",
- "notebook_job_type": {
- "job_type": "NOTEBOOK",
- "language": "Python3"
}, - "notebook_file_id": "1",
- "cluster_info": {
- "cluster_id": 13,
- "name": "gcp_test",
- "cluster_status": "RUNNING",
- "cluster_type": "YEEDU",
- "instance_size": 1,
- "min_instances": 1,
- "max_instances": 1,
- "cloud_env": {
- "cloud_env_id": 11,
- "name": "test",
- "cloud_provider": {
- "cloud_provider_id": 0,
- "name": "GCP"
}
}, - "cluster_conf": {
- "cluster_conf_id": 10,
- "cluster_conf_name": "n1-standard-4",
- "machine_type_category": "general_purpose",
- "machine_type": {
- "machine_type_id": 10,
- "name": "n1-standard-4",
- "vCPUs": 4,
- "memory": "15 GiB",
- "has_cuda": false,
- "gpu_model": null,
- "gpus": 0,
- "gpu_memory": null,
- "cpu_model": [
- "Intel Xeon Scalable (Skylake) 1st Generation",
- "Intel Xeon E5 v4 (Broadwell E5)",
- "Intel Xeon E5 v3 (Haswell)",
- "Intel Xeon E5 v2 (Ivy Bridge)",
- "Intel Xeon E5 (Sandy Bridge)"
], - "cpu_min_frequency_GHz": [
- "2.0",
- "2.2",
- "2.3",
- "2.5",
- "2.6"
], - "cpu_max_frequency_GHz": [
- "3.5",
- "3.7",
- "3.8",
- "3.5",
- "3.6"
], - "has_local_disk": false,
- "local_disk_size_GB": null,
- "local_num_of_disks": null,
- "local_disk_throughput_MB": null,
- "machine_price_ycu": 2.5
}, - "machine_volume_conf": {
- "volume_conf_id": 2,
- "name": "volume_gcp_2",
- "size": 375,
- "machine_volume_num": 2,
- "machine_volume_strip_num": 2
}
}, - "spark_infra_version": {
- "spark_infra_version_id": 1,
- "spark_docker_image_name": "v3.2.2-28",
- "spark_version": "3.2.2",
- "hive_version": "2.3.9",
- "hadoop_version": "3.2.4",
- "scala_version": "2.12.15",
- "python_version": "3.9.5",
- "notebook_support": true,
- "has_cuda_support": true,
- "thrift_support": false,
- "yeedu_functions_support": true
}, - "engine_cluster_spark_config": {
- "max_parallel_spark_job_execution_per_instance": 5,
- "num_of_workers": null
}, - "metastore_catalog_id": 1
}
}, - "tenant_id": "49a8817e-8b8e-4d76-a717-c33db6f7e018",
- "created_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "last_update_date": "2024-06-21T09:13:47.789222+00:00",
- "from_date": "2024-06-21T09:13:47.789222+00:00",
- "to_date": "2024-06-21T09:20:26.56676+00:00"
}
]
}Create a new notebook run.
A notebook run for a specific configuration can only be created if the attached cluster instance is not in ERROR, DESTROYING, or DESTROYED states.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
Request Body schema: application/jsonrequired
The notebook to be used.
| notebook_id | integer <int64> |
| notebook_name | string non-empty |
| arguments | string or null non-empty |
| conf | Array of strings unique [ items non-empty ] |
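A minimal submission sketch follows, assuming a bearer-token header and an illustrative base URL, route, and HTTP method (not taken from this specification); the body fields follow the request schema above.

```python
import requests

# Placeholders -- substitute your deployment URL, token, and IDs.
BASE_URL = "https://yeedu.example.com/api/v1"   # hypothetical base URL
HEADERS = {"Authorization": "Bearer <token>"}   # auth scheme assumed to be a bearer token
workspace_id = 5

payload = {
    "notebook_id": 27,
    "arguments": "1000",
    "conf": ["spark.executor.memory=4g"],   # run-time Spark overrides
}

# Illustrative route; use the actual path from this specification.
resp = requests.post(
    f"{BASE_URL}/workspace/{workspace_id}/notebook/run",
    headers=HEADERS,
    json=payload,
    timeout=30,
)
resp.raise_for_status()
print("submitted run", resp.json()["run_id"])
```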
Responses
Request samples
- Payload
{- "notebook_id": 1,
- "arguments": "1000",
- "conf": [
- "spark.executor.memory=4g"
]
}Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "run_id": "301",
- "notebook_id": "27",
- "arguments": "1000",
- "conf": [
- "spark.executor.memory=4g"
], - "cluster_id": "13",
- "tenant_id": "49a8817e-8b8e-4d76-a717-c33db6f7e018",
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-06-21T10:27:43.768Z",
- "from_date": "2024-06-21T10:27:43.768Z",
- "to_date": null
}Get details of a specific notebook run.
Retrieve notebook run configuration details by ID.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
| run_id required | integer <int64> Specifies the ID of the notebook run for filtering. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "run_id": 299,
- "run_status": "STOPPED",
- "execution_time_sec": 397.906651,
- "total_run_time_sec": 398.777538,
- "runtime_arguments": {
- "arguments": "1000",
- "conf": [
- "spark.executor.memory=4g"
]
}, - "notebook_conf": {
- "notebook_id": 27,
- "notebook_name": "Notebook_01",
- "notebook_job_type": {
- "job_type": "NOTEBOOK",
- "language": "Python3"
}, - "notebook_file_id": "1",
- "cluster_info": {
- "cluster_id": 13,
- "name": "gcp_test",
- "cluster_status": "RUNNING",
- "cluster_type": "YEEDU",
- "instance_size": 1,
- "min_instances": 1,
- "max_instances": 1,
- "cloud_env": {
- "cloud_env_id": 11,
- "name": "test",
- "cloud_provider": {
- "cloud_provider_id": 0,
- "name": "GCP"
}
}, - "cluster_conf": {
- "cluster_conf_id": 10,
- "cluster_conf_name": "n1-standard-4",
- "machine_type_category": "general_purpose",
- "machine_type": {
- "machine_type_id": 10,
- "name": "n1-standard-4",
- "vCPUs": 4,
- "memory": "15 GiB",
- "has_cuda": false,
- "gpu_model": null,
- "gpus": 0,
- "gpu_memory": null,
- "cpu_model": [
- "Intel Xeon Scalable (Skylake) 1st Generation",
- "Intel Xeon E5 v4 (Broadwell E5)",
- "Intel Xeon E5 v3 (Haswell)",
- "Intel Xeon E5 v2 (Ivy Bridge)",
- "Intel Xeon E5 (Sandy Bridge)"
], - "cpu_min_frequency_GHz": [
- "2.0",
- "2.2",
- "2.3",
- "2.5",
- "2.6"
], - "cpu_max_frequency_GHz": [
- "3.5",
- "3.7",
- "3.8",
- "3.5",
- "3.6"
], - "has_local_disk": false,
- "local_disk_size_GB": null,
- "local_num_of_disks": null,
- "local_disk_throughput_MB": null,
- "machine_price_ycu": 2.5
}, - "machine_volume_conf": {
- "volume_conf_id": 2,
- "name": "volume_gcp_2",
- "size": 375,
- "machine_volume_num": 2,
- "machine_volume_strip_num": 2
}
}, - "spark_infra_version": {
- "spark_infra_version_id": 1,
- "spark_docker_image_name": "v3.2.2-28",
- "spark_version": "3.2.2",
- "hive_version": "2.3.9",
- "hadoop_version": "3.2.4",
- "scala_version": "2.12.15",
- "python_version": "3.9.5",
- "notebook_support": true,
- "has_cuda_support": true,
- "thrift_support": false,
- "yeedu_functions_support": true
}, - "engine_cluster_spark_config": {
- "max_parallel_spark_job_execution_per_instance": 5,
- "num_of_workers": null
}, - "metastore_catalog_id": 1
}
}, - "workflow_job_instance_details": {
- "workflow_job_instance_status": {
- "workflow_job_instance_id": 356,
- "workflow_job_id": 356,
- "status": "DONE",
- "from_date": "2024-06-21T09:13:47.789222+00:00",
- "to_date": "2024-06-21T09:20:26.56676+00:00"
}, - "workflow_execution_process": {
- "machine_pid_number": "1964",
- "machine_hostname": "yeedu13-c9b7944b-b5c1-33ad-6055-78cc47b1769a",
- "machine_id": "c9b7944b-b5c1-33ad-6055-78cc47b1769a",
- "machine_pid_user": "root",
- "machine_node_number": "0"
}
}, - "tenant_id": "49a8817e-8b8e-4d76-a717-c33db6f7e018",
- "created_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "last_update_date": "2024-06-21T09:13:47.789222+00:00",
- "from_date": "2024-06-21T09:13:47.789222+00:00",
- "to_date": "2024-06-21T09:20:26.56676+00:00"
}Start or get the kernel status of a notebook run.
Start or retrieve the kernel status for a notebook run, provided the instance is in the RUNNING state.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
| run_id required | integer <int64> Specifies the ID of the notebook run for filtering. |
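A rough polling sketch is shown below; the base URL, route, and HTTP method are placeholders (this specification defines the real ones), and the loop simply waits until the kernel reports a status other than starting.

```python
import time

import requests

# Placeholders -- substitute your deployment URL, token, and IDs.
BASE_URL = "https://yeedu.example.com/api/v1"   # hypothetical base URL
HEADERS = {"Authorization": "Bearer <token>"}   # auth scheme assumed to be a bearer token
workspace_id, run_id = 5, 301

# Route and HTTP method are illustrative; check this specification for the real ones.
url = f"{BASE_URL}/workspace/{workspace_id}/notebook/run/{run_id}/kernel"

# Start (or fetch) the kernel, then poll until it leaves the "starting" state.
while True:
    resp = requests.post(url, headers=HEADERS, timeout=30)
    resp.raise_for_status()
    status = resp.json()["kernel_info"]["kernel_status"]
    if status != "starting":
        break
    time.sleep(5)
print("kernel is", status)
```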
Responses
Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "run_id": 301,
- "kernel_info": {
- "kernel_id": "d08a7416-6aa5-4a92-90db-b8257c324db5",
- "kernel_status": "starting"
}, - "session_id": "a708b294-8c6a-4ab0-b14f-dbea19bbf2ac"
}Create a WebSocket proxy for a notebook run.
Creates a proxy to establish a WebSocket connection for a specific notebook run, provided the instance is in the RUNNING state.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
| run_id required | integer <int64> Specifies the ID of the notebook run for filtering. |
query Parameters
| yeedu_session required | string Specifies the session token for Yeedu authentication. |
Responses
Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "message": "Created Proxy for Notebook run Id: 108."
}Get the kernel status for a specific notebook run.
Retrieves the kernel status of a notebook run, provided the instance is in the RUNNING state.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
| run_id required | integer <int64> Specifies the ID of the notebook run for filtering. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "kernel_id": "86d10c83-95a6-4fa7-9402-0bc752b925fe,",
- "session_id": "a708b294-8c6a-4ab0-b14f-dbea19bbf2ac",
- "language_name": "python3",
- "kernel_status": "starting",
- "active_connections": 0,
- "last_activity": "2023-08-17T08:02:33.518253Z"
}Interrupt a notebook run kernel.
Interrupts the kernel for a specific notebook run, provided the instance is in the RUNNING state.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
| run_id required | integer <int64> Specifies the ID of the notebook run for filtering. |
Responses
Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "run_id": 301,
- "kernel_info": {
- "kernel_id": "d08a7416-6aa5-4a92-90db-b8257c324db5",
- "kernel_status": "starting"
}, - "session_id": "a708b294-8c6a-4ab0-b14f-dbea19bbf2ac"
}Restart a notebook run kernel.
Restarts the kernel for a specific notebook run, provided the instance is in the RUNNING state.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
| run_id required | integer <int64> Specifies the ID of the notebook run for filtering. |
Responses
Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "run_id": 301,
- "kernel_info": {
- "kernel_id": "d08a7416-6aa5-4a92-90db-b8257c324db5",
- "kernel_status": "busy"
}
}Stop a notebook run.
A notebook run can only be stopped if its status is SUBMITTED or RUNNING.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
| run_id required | integer <int64> Specifies the ID of the notebook run to stop. |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "SparkKill": {
- "workflow_job_id": 354,
- "workflow_job_instance_id": 354,
- "spark_job_instance_id": 298,
- "spark_job_id": 23,
- "compute_engine_id": 13
}
}Download notebook run logs.
Downloads logs of a notebook run by specifying the run ID and log type.
- Provide last_n_lines to fetch only the last N lines of the log.
- Provide file_size_bytes to download only the first N bytes of the log.
If neither option is specified, the complete log file will be downloaded.
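For illustration, the sketch below previews the tail of a run's stderr log and then streams the full file to disk; the base URL, bearer-token header, and route are placeholders, while the query parameters match those documented below.

```python
import requests

# Placeholders -- substitute your deployment URL, token, and IDs.
BASE_URL = "https://yeedu.example.com/api/v1"   # hypothetical base URL
HEADERS = {"Authorization": "Bearer <token>"}   # auth scheme assumed to be a bearer token
workspace_id, run_id = 5, 299

# Illustrative route; use the actual path from this specification.
url = f"{BASE_URL}/workspace/{workspace_id}/notebook/run/{run_id}/log/stderr"

# Preview only the last 200 lines instead of downloading the whole file.
resp = requests.get(url, headers=HEADERS, params={"last_n_lines": 200}, timeout=60)
resp.raise_for_status()
print(resp.text)

# Or stream the complete log to disk by omitting both preview parameters.
with requests.get(url, headers=HEADERS, stream=True, timeout=60) as full:
    full.raise_for_status()
    with open("run_299_stderr.log", "wb") as fh:
        for chunk in full.iter_content(chunk_size=65536):
            fh.write(chunk)
```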
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
| run_id required | integer <int64> Specifies the ID of the notebook run for filtering. |
| log_type required | string Enum: "stdout" "stderr" The type of log file to filter. |
query Parameters
| last_n_lines | integer <int32> [ 1 .. 1000 ] Number of lines to retrieve from the end of the log file (sample preview). |
| file_size_bytes | integer <int64> >= 1 Number of bytes to preview from the beginning of the log file (sample preview). |
Responses
Response samples
- 400
- 401
- 404
- 500
{- "error_code": "string",
- "error_message": "string"
}Get workflow errors of a notebook run.
Fetch paginated lists of workflow execution errors encountered by a specific notebook run, useful for troubleshooting complex notebook workflows.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
| run_id required | integer <int64> Specifies the ID of the notebook run for filtering. |
query Parameters
| limit | integer <int32> Default: 100 The number of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "error": "OnError(Process finish with exit code 137,None,None,None,Some(akka.stream.alpakka.amqp.impl.AmqpSourceStage$$anon$1$$anon$2$$anon$3@47b8c55f))"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 1,
- "total_pages": 1,
- "limit": 10
}
}Get all filter data for Notebook runs within a workspace.
Retrieves Notebook runs data, including:
- Engine cluster instances used by Notebooks in the workspace.
- Notebooks executed within the workspace.
- Users who created the Notebook runs.
- Users who modified the Notebook runs.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| filter_type required | string Enum: "cluster" "notebook_conf" "created_by_user" "modified_by_user" Specifies the filter type for the data to retrieve. Choose one of the following:
|
| notebook_id | integer <int64> Specifies the ID of the notebook. |
| name | string Specifies the name to filter by, applicable to the chosen filter type. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "cluster_id": 1,
- "cluster_name": "azure_instance"
}
]
}Create a new widget in a notebook.
Creates a widget in a notebook with the specified fields.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| notebook_id | integer <int64> Specifies the ID of the notebook configuration to which the widget will be added. |
| notebook_name | string Specifies the name of the notebook configuration to which the widget will be added. |
Request Body schema: application/jsonrequired
Notebook Widget to be created.
| name required | string [ 1 .. 1024 ] characters |
| default_value required | string |
| label | string or null <= 2048 characters |
| widget_type required | string Enum: "text" "dropdown" "multiselect" "combobox" |
| choices | Array of strings [ 1 .. 1023 ] items unique Required only for widget types that take a list of choices (dropdown, multiselect, combobox). |
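The sketch below creates a dropdown widget, assuming a bearer-token header and an illustrative base URL and route (the actual path and method are defined by this specification); the body fields follow the schema above.

```python
import requests

# Placeholders -- substitute your deployment URL, token, and IDs.
BASE_URL = "https://yeedu.example.com/api/v1"   # hypothetical base URL
HEADERS = {"Authorization": "Bearer <token>"}   # auth scheme assumed to be a bearer token
workspace_id = 5

# A dropdown widget supplies its permitted values through "choices".
payload = {
    "name": "env",
    "default_value": "dev",
    "label": "Target environment",
    "widget_type": "dropdown",
    "choices": ["dev", "staging", "prod"],
}

# Illustrative route; use the actual path from this specification.
resp = requests.post(
    f"{BASE_URL}/workspace/{workspace_id}/notebook/widget",
    headers=HEADERS,
    params={"notebook_id": 27},   # target notebook configuration
    json=payload,
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```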
Responses
Request samples
- Payload
{- "name": "param_1",
- "default_value": "submitted",
- "label": "Enter the status of job",
- "widget_type": "text"
}Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "name": "param_1",
- "defaultValue": "running",
- "label": "Enter the status of job",
- "widgetType": "text",
- "currentValue": "submitted"
}Get all the Widgets of a notebook.
Retrieves a list of the widgets of a notebook.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| notebook_id | integer <int64> Specifies the ID of the notebook configuration for filtering. |
| notebook_name | string Specifies the name of the notebook configuration for filtering |
Responses
Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "widgets": {
- "param_2": {
- "currentValue": "",
- "nuid": "cf95595f-1b73-45ee-8419-7557b0d3cd7d",
- "typedWidgetInfo": {
- "autoCreated": false,
- "defaultValue": "a",
- "label": null,
- "name": "param_2",
- "options": {
- "widgetDisplayType": "Dropdown",
- "choices": [
- "a",
- "b",
- "c",
- " "
], - "fixedDomain": true,
- "multiselect": true
}, - "parameterDataType": "String"
}, - "widgetInfo": {
- "widgetType": "multiselect",
- "defaultValue": "a",
- "label": null,
- "name": "param_2",
- "options": {
- "widgetType": "dropdown",
- "autoCreated": false,
- "choices": [
- "a",
- "b",
- "c",
- " "
]
}
}
}, - "param_1": {
- "currentValue": "status",
- "nuid": "532a0de5-53a2-49b3-b14c-662ad1a8e5dd",
- "typedWidgetInfo": {
- "autoCreated": false,
- "defaultValue": "status",
- "label": null,
- "name": "param_1",
- "options": {
- "widgetDisplayType": "Text",
- "validationRegex": null
}, - "parameterDataType": "String"
}, - "widgetInfo": {
- "widgetType": "text",
- "defaultValue": "status",
- "label": null,
- "name": "param_1",
- "options": {
- "widgetType": "text",
- "autoCreated": false,
- "validationRegex": null
}
}
}
}
}Delete all the Widgets of a notebook.
Removes all widgets of a notebook.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| notebook_id | integer <int64> Specifies the ID of the notebook configuration used for deletion |
| notebook_name | string Specifies the name of the notebook configuration used for deletion |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "message": "Deleted all notebook widgets in notebook_conf_id: 1."
}Get details of a specific widget for a notebook
Retrieves the details of a specific widget, filtered by name.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| notebook_id | integer <int64> Specifies the ID of the notebook configuration for filtering. |
| notebook_name | string Specifies the name of the notebook configuration for filtering. |
| widget_name required | string Specifies the name of the widget for filtering |
Responses
Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "param_1": {
- "currentValue": "status",
- "nuid": "532a0de5-53a2-49b3-b14c-662ad1a8e5dd",
- "typedWidgetInfo": {
- "autoCreated": false,
- "defaultValue": "status",
- "label": null,
- "name": "param_1",
- "options": {
- "widgetDisplayType": "Text",
- "validationRegex": null
}, - "parameterDataType": "String"
}, - "widgetInfo": {
- "widgetType": "text",
- "defaultValue": "status",
- "label": null,
- "name": "param_1",
- "options": {
- "widgetType": "text",
- "autoCreated": false,
- "validationRegex": null
}
}
}
}Delete a specified widget of a notebook.
Removes the specified widget from a notebook.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| notebook_id | integer <int64> Specifies the ID of the notebook configuration used for deletion |
| notebook_name | string Specifies the name of the notebook configuration used for deletion |
| widget_name required | string Specifies the name of the widget to be deleted. |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "message": "Deleted widgets: text_widget in notebook_conf_id: 1."
}Update the fields of a specific widget for a notebook
Updates the fields of a specific widget, identified by name, for a notebook.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| notebook_id | integer <int64> Specifies the ID of the notebook configuration whose widget will be updated. |
| notebook_name | string Specifies the name of the notebook configuration whose widget will be updated. |
| widget_name required | string Specifies the name of the widget to be updated |
Request Body schema: application/jsonrequired
Widget fields to be modified.
| name | string [ 1 .. 1024 ] characters |
| default_value | string |
| label | string or null <= 2048 characters |
| widget_type | string Enum: "text" "dropdown" "multiselect" "combobox" |
| choices | Array of strings [ 1 .. 1023 ] items unique Valid only for widget types that take a list of choices (dropdown, multiselect, combobox). |
Responses
Request samples
- Payload
{- "name": "text_widget",
- "default_value": "aws",
- "label": "Enter the cloud environment"
}Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "name": "param_1",
- "defaultValue": "running",
- "label": "Enter the status of job",
- "widgetType": "text",
- "currentValue": "submitted"
}Get the value of a specific widget for a notebook
Retrieves the value of a specific widget, filtered by name.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| notebook_id | integer <int64> Specifies the ID of the notebook configuration for filtering. |
| notebook_name | string Specifies the name of the notebook configuration for filtering. |
| widget_name required | string Specifies the name of the widget for filtering |
Responses
Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "value": "aws"
}Update the value of a widget in a notebook
Updates the value of the specified widget in a notebook.
Authorizations:
path Parameters
| workspace_id required | integer <int64> Specifies the ID of the workspace for filtering. |
query Parameters
| notebook_id | integer <int64> Specifies the ID of the notebook configuration whose widget value will be updated. |
| notebook_name | string Specifies the name of the notebook configuration whose widget value will be updated. |
| widget_name required | string Specifies the name of the widget whose value is to be updated. |
Request Body schema: application/jsonrequired
Widget value to be modified.
| value required | string |
Responses
Request samples
- Payload
{- "value": "aws"
}Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "name": "param_1",
- "defaultValue": "running",
- "label": "Enter the status of job",
- "widgetType": "text",
- "currentValue": "submitted"
}Get an XCom entry
Retrieves the value of an XCom entry for a specific task in a given DAG run.
Authorizations:
path Parameters
| dag_id required | string The DAG identifier. |
| dag_run_id required | string The DAG run identifier. |
| task_id required | string The task identifier within the DAG. |
| key required | string The XCom key. |
query Parameters
| map_index | integer Default: -1 The map index for XCom entries that are part of a mapped task. |
Responses
Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "dag_id": "test_xcom_dag",
- "execution_date": "2025-08-17T09:41:17.892228+00:00",
- "key": "my_key",
- "map_index": -1,
- "task_id": "push_task",
- "timestamp": "2025-08-17T09:41:21.653500+00:00",
- "value": "hello_xcom"
}Create an XCom entry
Pushes a new XCom entry (key/value) for a specific task in a given DAG run.
Authorizations:
path Parameters
| dag_id required | string The DAG identifier. |
| dag_run_id required | string The DAG run identifier. |
| task_id required | string The task identifier within the DAG. |
Request Body schema: application/jsonrequired
The XCom entry to be created.
| key required | string The XCom key. |
| value required | string The XCom value (serialized as JSON). |
| map_index | integer Default: -1 The optional map index. Defaults to -1. |
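As an illustration of pushing an XCom value, a minimal sketch follows; the base URL, bearer-token header, and the Airflow-style route are assumptions (this specification defines the actual path), while the body fields follow the schema above.

```python
import requests

# Placeholders -- substitute your deployment URL, token, and identifiers.
BASE_URL = "https://yeedu.example.com/api/v1"   # hypothetical base URL
HEADERS = {"Authorization": "Bearer <token>"}   # auth scheme assumed to be a bearer token

dag_id = "test_xcom_dag"
dag_run_id = "manual__2025-08-17T09:41:17.892228+00:00"
task_id = "push_task"

payload = {"key": "result", "value": "res_value", "map_index": -1}

# Illustrative Airflow-style route; use the actual path from this specification.
resp = requests.post(
    f"{BASE_URL}/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries",
    headers=HEADERS,
    json=payload,
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["key"], "=", resp.json()["value"])
```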
Responses
Request samples
- Payload
{- "key": "result",
- "value": "res_value",
- "map_index": -1
}Response samples
- 201
- 400
- 401
- 403
- 404
- 500
{- "dag_id": "test_xcom_dag",
- "run_id": "manual__2025-08-17T09:41:17.892228+00:00",
- "task_id": "push_task",
- "key": "my_key",
- "value": "hello_xcom",
- "map_index": -1,
- "timestamp": "2025-08-17T09:41:21.653500+00:00",
- "logical_date": "2025-08-17T09:41:17.892228+00:00"
}Get Latest DAG Run
Retrieves the most recent DAG Run for the given DAG.
Authorizations:
path Parameters
| dag_id required | string The DAG identifier. |
Responses
Response samples
- 200
- 401
- 403
- 404
- 500
{- "dag_run_id": "manual__2025-08-18T13:22:25.052776+00:00",
- "dag_id": "test_dag_id",
- "logical_date": "2025-08-18T13:22:22.899000Z",
- "queued_at": "2025-08-18T13:22:25.064534Z",
- "start_date": "2025-08-18T13:22:25.546564Z",
- "end_date": "2025-08-18T13:22:27.753694Z",
- "data_interval_start": "2025-08-18T13:22:22.899000Z",
- "data_interval_end": "2025-08-18T13:22:22.899000Z",
- "run_after": "2025-08-18T13:22:22.899000Z",
- "last_scheduling_decision": "2025-08-18T13:22:27.750673Z",
- "run_type": "manual",
- "state": "success",
- "triggered_by": "rest_api",
- "conf": { },
- "note": null,
- "dag_versions": [
- {
- "id": "0198bd57-ce05-76ef-8d0a-195a2b55deff",
- "version_number": 1,
- "dag_id": "test_dag_id",
- "bundle_name": "dags-folder",
- "bundle_version": null,
- "created_at": "2025-08-18T13:21:44.965295Z",
- "bundle_url": null
}
], - "bundle_version": null
}Get a DAG Run
Retrieve details of a specific DAG Run for a given DAG.
Authorizations:
path Parameters
| dag_id required | string The DAG identifier. |
| dag_run_id required | string The DAG Run identifier. |
Responses
Response samples
- 200
- 400
- 401
- 403
- 404
- 500
{- "dag_run_id": "manual__2025-08-18T13:22:25.052776+00:00",
- "dag_id": "test_dag_id",
- "logical_date": "2025-08-18T13:22:22.899000Z",
- "queued_at": "2025-08-18T13:22:25.064534Z",
- "start_date": "2025-08-18T13:22:25.546564Z",
- "end_date": "2025-08-18T13:22:27.753694Z",
- "data_interval_start": "2025-08-18T13:22:22.899000Z",
- "data_interval_end": "2025-08-18T13:22:22.899000Z",
- "run_after": "2025-08-18T13:22:22.899000Z",
- "last_scheduling_decision": "2025-08-18T13:22:27.750673Z",
- "run_type": "manual",
- "state": "success",
- "triggered_by": "rest_api",
- "conf": { },
- "note": null,
- "dag_versions": [
- {
- "id": "0198bd57-ce05-76ef-8d0a-195a2b55deff",
- "version_number": 1,
- "dag_id": "test_dag_id",
- "bundle_name": "dags-folder",
- "bundle_version": null,
- "created_at": "2025-08-18T13:21:44.965295Z",
- "bundle_url": null
}
], - "bundle_version": null
}Get all the billed tenants.
Retrieves a list of tenants for whom billing records exist, filtered by billing type. This endpoint helps in managing and viewing tenant-specific billing data.
Authorizations:
query Parameters
| billing required | string Default: "invoice" Enum: "invoice" "usage" Specifies the billing type to be used as a filter. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "tenant_id": "34f5c98e-f430-457b-a812-92637d0c6fd0",
- "tenant_name": "string"
}
]Get all the billed date range.
Retrieves the list of billing periods (date ranges) associated with tenants, filtered by billing type and tenant list. Useful for identifying the available billing timeframes for reporting.
Authorizations:
query Parameters
| billing required | string Default: "invoice" Enum: "invoice" "usage" Specifies the billing type to be used as a filter. |
| tenant_ids | Array of strings <uuid> [ items <uuid > ] Specifies the tenant IDs to be used as a filter. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "months": [
- "2019-08-24"
]
}Get all the billed clusters.
Retrieves a list of compute clusters that have billing records, with filtering by billing type, tenant(s), and cloud provider(s). Helps in understanding cluster usage and cost attribution.
Authorizations:
query Parameters
| billing required | string Default: "invoice" Enum: "invoice" "usage" Specifies the billing type to be used as a filter. |
| tenant_ids | Array of strings <uuid> [ items <uuid > ] Specifies the tenant IDs to be used as a filter. |
| cloud_providers | Array of strings Items Enum: "GCP" "AWS" "Azure" Specifies the cloud providers to be used as a filter. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "cluster_id": 1,
- "cluster_name": "yeedu_cluster",
- "tenant_id": "34f5c98e-f430-457b-a812-92637d0c6fd0",
- "tenant_name": "test_tenant",
- "cloud_provider_id": 0,
- "cloud_provider_name": "GCP"
}
]Get all the billed machine types.
Retrieves the types of machines billed within specified tenants, cloud providers, and clusters, filtered by billing type. Useful for detailed cost analysis by infrastructure type.
Authorizations:
query Parameters
| billing required | string Default: "invoice" Enum: "invoice" "usage" Specifies the billing type to be used as a filter. |
| tenant_ids | Array of strings <uuid> [ items <uuid > ] Specifies the tenant IDs to be used as a filter. |
| cloud_providers | Array of strings Items Enum: "GCP" "AWS" "Azure" Specifies the cloud providers to be used as a filter. |
| cluster_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of cluster instance IDs to filter on. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "machine_type_id": 1,
- "machine_type_name": "n1-highcpu-16"
}
]Get all the cluster labels for billing.
Retrieves cluster labels (tags or metadata) used in billing, with filtering by billing type, tenants, cloud providers, and cluster instances. Helps categorize and organize billing data for clusters.
Authorizations:
query Parameters
| billing required | string Default: "invoice" Enum: "invoice" "usage" Specifies the billing type to be used as a filter. |
| tenant_ids | Array of strings <uuid> [ items <uuid > ] Specifies the tenant IDs to be used as a filter. |
| cloud_providers | Array of strings Items Enum: "GCP" "AWS" "Azure" Specifies the cloud providers to be used as a filter. |
| cluster_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of cluster instance IDs to filter on. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "key1": [
- "value1",
- "value2",
- "value3"
], - "key2": [
- "value1",
- "value2",
- "value3"
]
}Get the billing invoice data
Retrieves a list of billed invoices within a specified date range and filtered by tenants, cloud providers, clusters, machine types, and labels. Enables detailed invoice-level billing review and reconciliation.
Authorizations:
query Parameters
| tenant_ids | Array of strings <uuid> [ items <uuid > ] Specifies the tenant IDs to be used as a filter. |
| start_month_year required | string <date> Specifies the starting month & year (YYYY-MM-DD) to be used as a filter. |
| end_month_year required | string <date> Specifies the ending month & year (YYYY-MM-DD) to be used as a filter. |
| cloud_providers | Array of strings Items Enum: "GCP" "AWS" "Azure" Specifies the cloud providers to be used as a filter. |
| cluster_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of cluster instance IDs to filter on. |
| machine_type_ids | Array of integers <int64> unique [ items <int64 > ] Specifies the machine type IDs to be used as a filter. |
| labels | object Specifies the labels to be used as a filter. |
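A minimal query sketch is shown below, assuming a bearer-token header, an illustrative base URL and route, and array parameters serialized as repeated query keys; the parameter names and response fields come from this specification.

```python
import requests

# Placeholders -- substitute your deployment URL and token.
BASE_URL = "https://yeedu.example.com/api/v1"   # hypothetical base URL
HEADERS = {"Authorization": "Bearer <token>"}   # auth scheme assumed to be a bearer token

params = {
    "start_month_year": "2024-04-01",
    "end_month_year": "2024-05-01",
    # Array serialization assumed to be repeated query parameters.
    "cloud_providers": ["GCP", "AWS"],
}

# Illustrative route; use the actual path from this specification.
resp = requests.get(f"{BASE_URL}/billing/invoice", headers=HEADERS, params=params, timeout=30)
resp.raise_for_status()

body = resp.json()
for row in body["invoice"]:
    print(row["tenant_name"], row["cluster_name"], row["total_monthly_ycu"])
print("overall YCU:", body["invoice_overview"]["overall_total_ycu"])
```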
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "invoice": [
- {
- "tenant_id": "d75e0352-bf43-4e9b-9b2f-8b5e924854fd",
- "tenant_name": "test_billing",
- "cluster_id": 2,
- "cluster_name": "yeedu_gcp_instance",
- "cluster_labels": {
- "env": "test_billing",
- "resource": "yeedu",
- "vm": "yeedu_node",
- "tenant_id": "d75e0352-bf43-4e9b-9b2f-8b5e924854fd"
}, - "cloud_provider_id": 0,
- "cloud_provider_name": "GCP",
- "machine_type_id": 10,
- "machine_type": "n1-standard-4",
- "machine_price_ycu_hour": 2.5,
- "total_monthly_minutes": 236,
- "total_monthly_ycu": 9.833333333333334,
- "checkpoint_month": "2024-04-30T18:30:00+00:00",
- "from_date": "2024-05-30T10:23:44.952719+00:00",
- "to_date": "infinity"
}, - {
- "tenant_id": "c5b2ade3-2385-4dbe-95d7-1fb4c373dc44",
- "tenant_name": "ui_dev_tenant",
- "cluster_id": 1,
- "cluster_name": "gcp_test_cluster",
- "cluster_labels": {
- "resource": "yeedu",
- "vm": "yeedu_node",
- "tenant_id": "c5b2ade3-2385-4dbe-95d7-1fb4c373dc44"
}, - "cloud_provider_id": 0,
- "cloud_provider_name": "GCP",
- "machine_type_id": 10,
- "machine_type": "n1-standard-4",
- "machine_price_ycu_hour": 2.5,
- "total_monthly_minutes": 440,
- "total_monthly_ycu": 18.333333333333332,
- "checkpoint_month": "2024-04-30T18:30:00+00:00",
- "from_date": "2024-05-30T10:23:44.952719+00:00",
- "to_date": "infinity"
}, - {
- "tenant_id": "c5b2ade3-2385-4dbe-95d7-1fb4c373dc44",
- "tenant_name": "ui_dev_tenant",
- "cluster_id": 1,
- "cluster_name": "gcp_test_cluster",
- "cluster_labels": {
- "resource": "yeedu",
- "vm": "yeedu_node",
- "tenant_id": "c5b2ade3-2385-4dbe-95d7-1fb4c373dc44"
}, - "cloud_provider_id": 0,
- "cloud_provider_name": "GCP",
- "machine_type_id": 10,
- "machine_type": "n1-standard-4",
- "machine_price_ycu_hour": 2.5,
- "total_monthly_minutes": 86,
- "total_monthly_ycu": 3.583333333333333,
- "checkpoint_month": "2024-05-01T00:00:00+00:00",
- "from_date": "2024-05-30T10:23:44.952719+00:00",
- "to_date": "infinity"
}, - {
- "tenant_id": "d75e0352-bf43-4e9b-9b2f-8b5e924854fd",
- "tenant_name": "test_billing",
- "cluster_id": 3,
- "cluster_name": "yeedu_aws_instance",
- "cluster_labels": {
- "env": "billing_test",
- "resource": "yeedu",
- "vm": "yeedu_node",
- "tenant_id": "d75e0352-bf43-4e9b-9b2f-8b5e924854fd"
}, - "cloud_provider_id": 1,
- "cloud_provider_name": "AWS",
- "machine_type_id": 89,
- "machine_type": "m5d.xlarge",
- "machine_price_ycu_hour": 2.6,
- "total_monthly_minutes": 234,
- "total_monthly_ycu": 10.14,
- "checkpoint_month": "2024-04-30T18:30:00+00:00",
- "from_date": "2024-05-30T10:23:44.952719+00:00",
- "to_date": "infinity"
}, - {
- "tenant_id": "d75e0352-bf43-4e9b-9b2f-8b5e924854fd",
- "tenant_name": "test_billing",
- "cluster_id": 4,
- "cluster_name": "yeedu_azure_instance",
- "cluster_labels": {
- "test": "billing",
- "resource": "yeedu",
- "vm": "yeedu_node",
- "tenant_id": "d75e0352-bf43-4e9b-9b2f-8b5e924854fd"
}, - "cloud_provider_id": 2,
- "cloud_provider_name": "Azure",
- "machine_type_id": 206,
- "machine_type": "Standard_D4d_v5",
- "machine_price_ycu_hour": 2.6,
- "total_monthly_minutes": 3,
- "total_monthly_ycu": 0.13,
- "checkpoint_month": "2024-04-30T18:30:00+00:00",
- "from_date": "2024-05-30T10:23:44.952719+00:00",
- "to_date": "infinity"
}, - {
- "tenant_id": "d75e0352-bf43-4e9b-9b2f-8b5e924854fd",
- "tenant_name": "test_billing",
- "cluster_id": 5,
- "cluster_name": "azure_test",
- "cluster_labels": {
- "resource": "yeedu",
- "vm": "yeedu_node",
- "tenant_id": "d75e0352-bf43-4e9b-9b2f-8b5e924854fd"
}, - "cloud_provider_id": 2,
- "cloud_provider_name": "Azure",
- "machine_type_id": 206,
- "machine_type": "Standard_D4d_v5",
- "machine_price_ycu_hour": 2.6,
- "total_monthly_minutes": 3,
- "total_monthly_ycu": 0.13,
- "checkpoint_month": "2024-04-30T18:30:00+00:00",
- "from_date": "2024-05-30T10:23:44.952719+00:00",
- "to_date": "infinity"
}
], - "invoice_overview": {
- "overall_total_minutes": 1002,
- "overall_total_ycu": 42.15,
- "total_clusters_count": 5,
- "total_tenants": 2
}
}Get the billing usage data
Retrieves usage-based billing data for a specified date range, with optional filters for tenants, cloud providers, clusters, machine types, and labels. Supports tracking of resource consumption for cost allocation and forecasting.
Authorizations:
query Parameters
| tenant_ids | Array of strings <uuid> [ items <uuid > ] Specifies the tenant IDs to be used as a filter. |
| start_date required | string <date> Specifies the starting date (YYYY-MM-DD) to be used as a filter. |
| end_date required | string <date> Specifies the ending date (YYYY-MM-DD) to be used as a filter. |
| cloud_providers | Array of strings Items Enum: "GCP" "AWS" "Azure" Specifies the cloud providers to be used as a filter. |
| cluster_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of cluster instance IDs to filter on. |
| machine_type_ids | Array of integers <int64> unique [ items <int64 > ] An optional set of machine type IDs to filter on. |
| labels | object Specifies the labels to be used as a filter. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "usage": [
- {
- "tenant_id": "d75e0352-bf43-4e9b-9b2f-8b5e924854fd",
- "tenant_name": "test_billing",
- "cluster_id": 2,
- "cluster_name": "yeedu_gcp_instance",
- "cluster_labels": {
- "env": "test_billing",
- "resource": "yeedu",
- "vm": "yeedu_node",
- "tenant_id": "d75e0352-bf43-4e9b-9b2f-8b5e924854fd"
}, - "cloud_provider_id": 0,
- "cloud_provider_name": "GCP",
- "machine_type_id": 10,
- "machine_type": "n1-standard-4",
- "machine_price_ycu_hour": 2.5,
- "total_daily_minutes": 236,
- "total_daily_ycu": 9.833333333333334,
- "checkpoint_day": "2024-05-29T18:30:00+00:00",
- "from_date": "2024-05-30T10:23:47.447166+00:00",
- "to_date": "infinity"
}, - {
- "tenant_id": "c5b2ade3-2385-4dbe-95d7-1fb4c373dc44",
- "tenant_name": "ui_dev_tenant",
- "cluster_id": 1,
- "cluster_name": "gcp_test_cluster",
- "cluster_labels": {
- "resource": "yeedu",
- "vm": "yeedu_node",
- "tenant_id": "c5b2ade3-2385-4dbe-95d7-1fb4c373dc44"
}, - "cloud_provider_id": 0,
- "cloud_provider_name": "GCP",
- "machine_type_id": 10,
- "machine_type": "n1-standard-4",
- "machine_price_ycu_hour": 2.5,
- "total_daily_minutes": 440,
- "total_daily_ycu": 18.333333333333332,
- "checkpoint_day": "2024-05-29T18:30:00+00:00",
- "from_date": "2024-05-30T10:23:47.447166+00:00",
- "to_date": "infinity"
}, - {
- "tenant_id": "d75e0352-bf43-4e9b-9b2f-8b5e924854fd",
- "tenant_name": "test_billing",
- "cluster_id": 3,
- "cluster_name": "yeedu_aws_instance",
- "cluster_labels": {
- "env": "billing_test",
- "resource": "yeedu",
- "vm": "yeedu_node",
- "tenant_id": "d75e0352-bf43-4e9b-9b2f-8b5e924854fd"
}, - "cloud_provider_id": 1,
- "cloud_provider_name": "AWS",
- "machine_type_id": 89,
- "machine_type": "m5d.xlarge",
- "machine_price_ycu_hour": 2.6,
- "total_daily_minutes": 234,
- "total_daily_ycu": 10.14,
- "checkpoint_day": "2024-05-29T18:30:00+00:00",
- "from_date": "2024-05-30T10:23:47.447166+00:00",
- "to_date": "infinity"
}, - {
- "tenant_id": "d75e0352-bf43-4e9b-9b2f-8b5e924854fd",
- "tenant_name": "test_billing",
- "cluster_id": 4,
- "cluster_name": "yeedu_azure_instance",
- "cluster_labels": {
- "test": "billing",
- "resource": "yeedu",
- "vm": "yeedu_node",
- "tenant_id": "d75e0352-bf43-4e9b-9b2f-8b5e924854fd"
}, - "cloud_provider_id": 2,
- "cloud_provider_name": "Azure",
- "machine_type_id": 206,
- "machine_type": "Standard_D4d_v5",
- "machine_price_ycu_hour": 2.6,
- "total_daily_minutes": 3,
- "total_daily_ycu": 0.13,
- "checkpoint_day": "2024-05-29T18:30:00+00:00",
- "from_date": "2024-05-30T10:23:47.447166+00:00",
- "to_date": "infinity"
}, - {
- "tenant_id": "d75e0352-bf43-4e9b-9b2f-8b5e924854fd",
- "tenant_name": "test_billing",
- "cluster_id": 5,
- "cluster_name": "azure_test",
- "cluster_labels": {
- "resource": "yeedu",
- "vm": "yeedu_node",
- "tenant_id": "d75e0352-bf43-4e9b-9b2f-8b5e924854fd"
}, - "cloud_provider_id": 2,
- "cloud_provider_name": "Azure",
- "machine_type_id": 206,
- "machine_type": "Standard_D4d_v5",
- "machine_price_ycu_hour": 2.6,
- "total_daily_minutes": 3,
- "total_daily_ycu": 0.13,
- "checkpoint_day": "2024-05-29T18:30:00+00:00",
- "from_date": "2024-05-30T10:23:47.447166+00:00",
- "to_date": "infinity"
}
], - "usage_overview": {
- "overall_total_minutes": 916,
- "overall_total_ycu": 38.56666666666667,
- "total_clusters_count": 5,
- "total_tenants": 2
}
}startConversation
Creates a new conversation that expires after 7 days of inactivity.
Authorizations:
Responses
Response samples
- 200
- 400
- 401
- 403
- 404
- 500
{- "conversation_id": "cc71b11a-25cd-4c2d-9950-df2cc38e3407",
- "created_at": "2019-08-24T14:15:22Z",
- "user_id": "string"
}sendMessage
Sends a message to the assistant in an existing conversation.
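A minimal sketch of a sendMessage call, assuming a bearer session token and a placeholder route that embeds the conversation_id path parameter (the literal path is not shown in this reference). The body fields mirror the request schema below; the context values are illustrative only.

```python
import requests

BASE_URL = "https://yeedu.example.com/api/v1"  # hypothetical host; base path assumed
TOKEN = "<JWT_TOKEN>"
conversation_id = "cc71b11a-25cd-4c2d-9950-df2cc38e3407"  # returned by startConversation

payload = {
    "message": "Explain what the selected notebook cell does.",
    "time_zone": "UTC",
    "context": {                 # optional AssistantContext; values here are illustrative
        "workspace_id": "1",
        "notebook_id": "42",
        "notebook_type": "sql",
    },
}

# NOTE: the literal route is a placeholder; only the path parameter name is documented.
resp = requests.post(
    f"{BASE_URL}/assistant/conversation/{conversation_id}/message",
    headers={"Authorization": f"Bearer {TOKEN}"},  # auth header scheme assumed
    json=payload,
    timeout=60,
)
resp.raise_for_status()
```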
Authorizations:
path Parameters
| conversation_id required | string <uuid> |
Request Body schema: application/jsonrequired
| message required | string The message to send to the assistant |
| model_name | string The LLM model that should be used by the assistant. |
| time_zone | string Default: "UTC" User's timezone to be considered by the assistant. |
| context | object (AssistantContext) Context that should be provided to the assistant. |
Responses
Request samples
- Payload
{- "message": "string",
- "model_name": "string",
- "time_zone": "UTC",
- "context": {
- "workspace_id": "string",
- "notebook_id": "string",
- "notebook_file_id": "string",
- "notebook_name": "string",
- "notebook_type": "sql",
- "operation": "auto_complete",
- "current_cell_id": "fb39937c-82d3-421a-8936-0b863227ff48",
- "error_msg": "string",
- "current_cell": [
- "string"
], - "cell_output": "string",
- "metastore_catalog_id": "string",
- "recent_queries": [
- "string"
], - "current_notebook_code": [
- { }
]
}
}Response samples
- 400
- 401
- 403
- 404
- 500
{- "error_code": "string",
- "error_message": "string"
}getConversationHistory
Retrieves the message history for a conversation.
Authorizations:
path Parameters
| conversation_id required | string <uuid> |
query Parameters
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "conversation_id": "cc71b11a-25cd-4c2d-9950-df2cc38e3407",
- "messages": [
- {
- "message_id": "d7d9d9fd-478f-40e6-b651-49b7f19878a2",
- "role": "user",
- "content": "string",
- "timestamp": "2019-08-24T14:15:22Z",
- "tools_used": [
- "string"
], - "context": {
- "notebook_id": "string",
- "workspace_id": "string",
- "notebook_name": "string"
}
}
], - "total_messages": 0,
- "created_at": "2019-08-24T14:15:22Z"
}listConversations
Lists all conversations for the authenticated user.
Authorizations:
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "conversation_id": "cc71b11a-25cd-4c2d-9950-df2cc38e3407",
- "created_at": "2019-08-24T14:15:22Z",
- "last_message_at": "2019-08-24T14:15:22Z",
- "message_count": 0,
- "preview": "string"
}
]renameConversationTitle
Renames the title of the conversation.
Authorizations:
path Parameters
| conversation_id required | string <uuid> |
Request Body schema: application/jsonrequired
Conversation title to be renamed.
| title required | string non-empty |
Responses
Request samples
- Payload
{- "title": "string"
}Response samples
- 200
- 400
- 401
- 404
- 500
{- "title": "string"
}deleteConversation
Deletes a conversation and all its messages.
Authorizations:
path Parameters
| conversation_id required | string <uuid> |
Responses
Response samples
- 200
- 401
- 404
{- "status": "deleted",
- "conversation_id": "cc71b11a-25cd-4c2d-9950-df2cc38e3407",
- "messages_deleted": 0,
- "deleted_at": "2019-08-24T14:15:22Z"
}Generate yeedu session token
Generates a session token for authenticated access by validating provided login credentials. This token grants access to protected resources within the platform.
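A minimal sketch of generating and reusing a session token, assuming an LDAP login and a placeholder route; the request body fields and the token field in the response come from the schema and sample below, while the host, path, and the bearer header used afterwards are assumptions.

```python
import requests

BASE_URL = "https://yeedu.example.com/api/v1"  # hypothetical host; base path assumed

# NOTE: the literal login route is a placeholder; check your deployment for the exact path.
resp = requests.post(
    f"{BASE_URL}/login",
    json={
        "username": "USER",
        "password": "PASS",
        "auth_type": "LDAP",   # see the '/login/auth_type/' API for supported values
        "timeout": "48h",      # defaults to 48 hours if omitted
    },
    timeout=30,
)
resp.raise_for_status()
token = resp.json()["token"]

# Subsequent calls would carry the token, e.g. as a bearer header (header scheme assumed):
headers = {"Authorization": f"Bearer {token}"}
```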
Request Body schema: application/jsonrequired
Login details to generate session token
| username | string non-empty Required for 'LDAP' or 'AAD' auth_type. |
| password | string <password> non-empty Required for 'LDAP' or 'AAD' auth_type. |
| timeout | string non-empty Default: "48h" Defaults to 48 hours if not provided. |
| auth_type required | string Enum: "LDAP" "AAD" "AZURE_SSO" Use the '/login/auth_type/' API to get the supported auth_type. |
| redirect_url | string or null non-empty Optional. Used when the client requires a successful 'AZURE_SSO' login callback to a specified redirection URL. |
Responses
Request samples
- Payload
{- "username": "USER",
- "password": "PASS",
- "timeout": "48h",
- "auth_type": "LDAP",
}Response samples
- 200
- 400
- 404
- 500
{- "token": "<JWT_TOKEN>"
}Get all the User Tenants
All Tenant Ids available for a specific user are returned in the form of JSON
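Because the endpoint is paginated via limit and pageNumber, a client typically loops until result_set.current_page reaches result_set.total_pages. The sketch below assumes a placeholder route and bearer header; the pagination parameters and result_set fields come from this reference.

```python
import requests

BASE_URL = "https://yeedu.example.com/api/v1"      # hypothetical host; base path assumed
HEADERS = {"Authorization": "Bearer <JWT_TOKEN>"}  # auth header scheme assumed

tenants, page = [], 1
while True:
    # NOTE: the literal route is a placeholder; only the query parameters are documented here.
    resp = requests.get(
        f"{BASE_URL}/user/tenants",
        headers=HEADERS,
        params={"limit": 100, "pageNumber": page},
        timeout=30,
    )
    resp.raise_for_status()
    body = resp.json()
    tenants.extend(body["data"])
    if body["result_set"]["current_page"] >= body["result_set"]["total_pages"]:
        break
    page += 1

print(f"user belongs to {len(tenants)} tenant(s)")
```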
Authorizations:
query Parameters
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "tenant_id": "ffafef39-1948-4cd2-81b0-8243eadf6f75",
- "name": "manual_ui_test",
- "description": null
}, - {
- "tenant_id": "59f0724b-d711-47ab-8010-497f1f715735",
- "name": "integration_test_tenant",
- "description": "Creating_tenant_integration_test_tenant"
}, - {
- "tenant_id": "cf1f945f-01ce-4ac6-a070-8c733f2fa791",
- "name": "ui_testing",
- "description": "creating tenant ui_testing"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 3,
- "total_pages": 1,
- "limit": 100
}
}Search tenant across the user tenants
All Tenants available for a specific user are returned based on the tenant name
Authorizations:
query Parameters
| tenant_name required | string The tenant name that will be used for filtering |
| pageNumber | integer <int32> Default: 1 The page number from which items will return |
| limit | integer <int32> Default: 100 The numbers of items to return |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "tenant_id": "ffafef39-1948-4cd2-81b0-8243eadf6f75",
- "name": "manual_ui_test",
- "description": null
}, - {
- "tenant_id": "59f0724b-d711-47ab-8010-497f1f715735",
- "name": "integration_test_tenant",
- "description": "Creating_tenant_integration_test_tenant"
}, - {
- "tenant_id": "cf1f945f-01ce-4ac6-a070-8c733f2fa791",
- "name": "ui_testing",
- "description": "creating tenant ui_testing"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 3,
- "total_pages": 1,
- "limit": 100
}
}Associate a tenant Id with the current session.
Associates the provided Tenant Id with the current session token.
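A minimal sketch of associating a tenant with the session; the HTTP method and route are assumptions (only the tenant_id path parameter and the success message are documented in this reference).

```python
import requests

BASE_URL = "https://yeedu.example.com/api/v1"       # hypothetical host; base path assumed
HEADERS = {"Authorization": "Bearer <JWT_TOKEN>"}   # auth header scheme assumed
tenant_id = "ffafef39-1948-4cd2-81b0-8243eadf6f75"  # one of the user's tenant IDs

# NOTE: method and route are placeholders; the export documents only the path parameter.
resp = requests.post(f"{BASE_URL}/user/select/{tenant_id}", headers=HEADERS, timeout=30)
resp.raise_for_status()
print(resp.json()["message"])  # e.g. "Successfully associated Tenant"
```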
Authorizations:
path Parameters
| tenant_id required | string <uuid> Tenant Id that will be used to associate |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 500
{- "message": "Successfully associated Tenant"
}Get all the User Information
User information for a specific user is returned in the form of JSON
Authorizations:
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "user_id": "2",
- "tenant_id": "be7da193-7c44-429a-8631-8a60567ae6a2",
- "username": "USER",
- "email": "user@yeedu.io",
- "group_info": [
- {
- "group_id": 1,
- "groupname": "Data_Engineering",
- "group_type": "Unified"
}, - {
- "group_id": 2,
- "groupname": "Data_Curation",
- "group_type": "DynamicMembership"
}
], - "from_date": "2024-06-14T10:11:02.278Z",
- "to_date": null
}Get User Roles for a specific session
User roles are returned across all tenants if the tenant_id is not provided.
Authorizations:
query Parameters
| tenant_id | string <uuid> Tenant ID that will be used to get the user roles within it. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "user_id": 4,
- "username": "ru0000-yeedu@yeedu.io",
- "email": "ru0000-yeedu@yeedu.io",
- "user_roles": [
- {
- "roles_id": 0,
- "user_role": "User"
}
], - "group_roles": null,
- "from_date": "2024-06-14T10:11:02.278675+00:00",
- "to_date": "infinity"
}List all user tokens
Lists all tokens generated by the logged-in user.
Authorizations:
query Parameters
| pageNumber | integer <int32> Default: 1 The page number from which items will return |
| limit | integer <int32> Default: 100 The numbers of items to return |
Responses
Response samples
- 200
- 400
- 401
- 500
{- "data": [
- {
- "token_id": 26,
- "description": "string",
- "tenant_id": "8cee6100-7086-4138-92fd-712046174e91",
- "created_by": {
- "user_id": 1,
- "username": "test@yeedu.com"
}, - "modified_by": {
- "user_id": 1,
- "username": "test@yeedu.com"
}, - "expiry_time": "infinity"
}, - {
- "token_id": 25,
- "description": "string",
- "tenant_id": "8cee6100-7086-4138-92fd-712046174e91",
- "created_by": {
- "user_id": 1,
- "username": "test@yeedu.com"
}, - "modified_by": {
- "user_id": 1,
- "username": "test@yeedu.com"
}, - "expiry_time": "2025-03-06T06:16:27.396+00:00"
}, - {
- "token_id": 24,
- "description": "string",
- "tenant_id": "8cee6100-7086-4138-92fd-712046174e91",
- "created_by": {
- "user_id": 1,
- "username": "test@yeedu.com"
}, - "modified_by": {
- "user_id": 1,
- "username": "test@yeedu.com"
}, - "expiry_time": "2025-03-05T18:56:27.994+00:00"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 3,
- "total_pages": 1,
- "limit": 100
}
}Generate a new user token
Generates a new token for the user within a specific tenant.
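A minimal sketch of requesting a longer-lived user token with the session token obtained at login; the body fields and the token response field come from the schema and sample below, while the route and header scheme are assumptions.

```python
import requests

BASE_URL = "https://yeedu.example.com/api/v1"           # hypothetical host; base path assumed
HEADERS = {"Authorization": "Bearer <SESSION_TOKEN>"}   # auth header scheme assumed

# NOTE: the literal route is a placeholder.
resp = requests.post(
    f"{BASE_URL}/user/token",
    headers=HEADERS,
    json={"description": "CI pipeline token", "timeout": "30 days"},
    timeout=30,
)
resp.raise_for_status()
api_token = resp.json()["token"]  # store securely
```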
Authorizations:
Request Body schema: application/jsonrequired
| description | string |
| timeout | string non-empty Default: "30 days" Defaults to 30 days if not provided. |
Responses
Request samples
- Payload
{- "description": "string",
- "timeout": "30 days"
}Response samples
- 201
- 400
- 401
- 500
{- "token": "<JWT_TOKEN>"
}Get all the user secrets.
Retrieves a list of user secrets created by the user of the current session.
Authorizations:
query Parameters
| secret_type | string Enum: "HIVE KERBEROS" "HIVE BASIC" "DATABRICKS UNITY TOKEN" "ENVIRONMENT VARIABLE" "AWS ACCESS SECRET KEY PAIR" "AZURE SERVICE PRINCIPAL" "GOOGLE SERVICE ACCOUNT" Type of secret to filter the secrets. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
| user_secret_id | string Filter secrets by a specific user secret ID. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "user_secret_id": 9,
- "description": "Neo4j crdentials",
- "env_var_secret": "secretKeyName",
- "created_by": {
- "user_id": 3,
- "username": "rm0000@yeedu.io"
}, - "last_update_date": "2025-03-05T13:12:31.659431+00:00",
- "from_date": "2025-03-05T13:12:31.659431+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 3,
- "total_pages": 3,
- "limit": 1,
- "next_page": 2
}
}Search all the user secrets created by a user based on secret name.
Retrieves a list of user secrets created by the user of the current session, filtered by secret name.
Authorizations:
query Parameters
| secret_name required | string Secret name that will be used for filter |
| secret_type | string Enum: "HIVE KERBEROS" "HIVE BASIC" "DATABRICKS UNITY TOKEN" "ENVIRONMENT VARIABLE" "AWS ACCESS SECRET KEY PAIR" "AZURE SERVICE PRINCIPAL" "GOOGLE SERVICE ACCOUNT" Type of secret to filter the secrets. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "user_secret_id": 9,
- "description": "Neo4j crdentials",
- "env_var_secret": "secretKeyName",
- "created_by": {
- "user_id": 3,
- "username": "rm0000@yeedu.io"
}, - "last_update_date": "2025-03-05T13:12:31.659431+00:00",
- "from_date": "2025-03-05T13:12:31.659431+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 3,
- "total_pages": 3,
- "limit": 1,
- "next_page": 2
}
}Create a secret
Creates a new secret associated with the authenticated user.
- Only the user who created the secret can access the secret.
- The user must have at least one role assigned in at least one tenant.
- Each secret name must be unique.
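A minimal sketch of creating an AWS-style user secret, mirroring the AWS ACCESS SECRET KEY PAIR variant of the request sample below; the route and bearer header are placeholders, and the credential values are dummies.

```python
import requests

BASE_URL = "https://yeedu.example.com/api/v1"      # hypothetical host; base path assumed
HEADERS = {"Authorization": "Bearer <JWT_TOKEN>"}  # auth header scheme assumed

payload = {
    "secret_type": "AWS ACCESS SECRET KEY PAIR",
    "name": "aws_secret",                          # must be unique
    "description": "AWS credentials for accessing S3",
    "aws_access_key_id": "AKIAXXXXXXX",            # dummy value
    "aws_secret_access_key": "abcd1234XXXX",       # dummy value
    "aws_default_region": "us-west-2",
}

# NOTE: the literal route is a placeholder; the payload shape follows the request sample below.
resp = requests.post(f"{BASE_URL}/user/secret", headers=HEADERS, json=payload, timeout=30)
resp.raise_for_status()
print(resp.json()["user_secret_id"])
```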
Authorizations:
Request Body schema: application/jsonrequired
User secret to be created.
| secret_type required | string Enum: "HIVE KERBEROS" "HIVE BASIC" "DATABRICKS UNITY TOKEN" "ENVIRONMENT VARIABLE" "AWS ACCESS SECRET KEY PAIR" "AZURE SERVICE PRINCIPAL" "GOOGLE SERVICE ACCOUNT" Type of authentication secret. |
| name required | string non-empty Secret identifier. |
| description | string or null non-empty Optional secret details. |
| principal required | string non-empty Kerberos principal. |
| keytab required | string non-empty Keytab file path. |
Responses
Request samples
- Payload
{- "secret_type": "AWS ACCESS SECRET KEY PAIR",
- "name": "aws_secret",
- "description": "AWS credentials for accessing S3",
- "type": "string",
- "project_id": "string",
- "private_key_id": "string",
- "private_key": "string",
- "client_email": "string",
- "client_id": "string",
- "auth_uri": "string",
- "token_uri": "string",
- "auth_provider_x509_cert_url": "string",
- "client_x509_cert_url": "string",
- "access_key": "string",
- "secret_key": "string",
- "universe_domain": "string",
- "aws_access_key_id": "AKIAXXXXXXX",
- "aws_secret_access_key": "abcd1234XXXX",
- "aws_default_region": "us-west-2"
}Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "user_secret_id": "1",
- "description": "Neo4j crdentials",
- "env_var_secret": "secretKeyName",
- "created_by_user_id": "1",
- "last_update_date": "2025-03-04T06:42:27.002Z",
- "from_date": "2025-03-04T06:42:27.002Z",
- "to_date": null
}Edit a secret
Updates the details of an existing user secret identified by user_secret_id.
Authorizations:
query Parameters
| user_secret_id required | integer <int64> The ID of the user secret to be edited. |
Request Body schema: application/jsonrequired
User secret to be updated.
| secret_type | string Enum: "HIVE KERBEROS" "HIVE BASIC" "DATABRICKS UNITY TOKEN" "ENVIRONMENT VARIABLE" "AWS ACCESS SECRET KEY PAIR" "AZURE SERVICE PRINCIPAL" "GOOGLE SERVICE ACCOUNT" Type of authentication secret. |
| description | string or null non-empty Optional secret details. |
| principal | string non-empty Kerberos principal. |
| keytab | string non-empty Keytab file path. |
Responses
Request samples
- Payload
{- "secret_type": "AWS ACCESS SECRET KEY PAIR",
- "description": "AWS credentials for accessing S3",
- "type": "string",
- "project_id": "string",
- "private_key_id": "string",
- "private_key": "string",
- "client_email": "string",
- "client_id": "string",
- "auth_uri": "string",
- "token_uri": "string",
- "auth_provider_x509_cert_url": "string",
- "client_x509_cert_url": "string",
- "access_key": "string",
- "secret_key": "string",
- "universe_domain": "string",
- "aws_access_key_id": "AKIAXXXXXXX",
- "aws_secret_access_key": "abcd1234XXXX",
- "aws_default_region": "us-west-2"
}Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "user_secret_id": "1",
- "description": "Neo4j crdentials",
- "env_var_secret": "secretKeyName",
- "created_by_user_id": "1",
- "last_update_date": "2025-03-04T06:42:27.002Z",
- "from_date": "2025-03-04T06:42:27.002Z",
- "to_date": null
}Delete an existing user secret.
Deletes a user secret created by the user of the current session.
Authorizations:
query Parameters
| user_secret_id required | integer <int64> The ID of the user secret to be deleted. |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 500
{- "message": "Deleted user secret id: 1."
}Get all the Users in a tenant
Retrieves a paginated list of all users within a tenant, providing user details in JSON format.
Authorizations:
query Parameters
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "user_id": 13,
- "username": "ru0002-yeedu@yeedu.io",
- "display_name": "RU0001",
- "email": "ru0002-yeedu@yeedu.io"
}, - {
- "user_id": 12,
- "username": "ru0001-yeedu@yeedu.io",
- "display_name": "RU0001",
- "email": "ru0001-yeedu@yeedu.io"
}, - {
- "user_id": 10,
- "username": "ysu0001-yeedu@yeedu.io",
- "display_name": "YSU0001",
- "email": "ysu0001-yeedu@yeedu.io"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 3,
- "total_pages": 1,
- "limit": 100
}
}Search all the Users by username in a tenant
Searches users by username within a tenant, returning paginated user details matching the search term.
Authorizations:
query Parameters
| username required | string Username that will be used for filter |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "user_id": 13,
- "username": "ru0002-yeedu@yeedu.io",
- "display_name": "RU0001",
- "email": "ru0002-yeedu@yeedu.io"
}, - {
- "user_id": 12,
- "username": "ru0001-yeedu@yeedu.io",
- "display_name": "RU0001",
- "email": "ru0001-yeedu@yeedu.io"
}, - {
- "user_id": 10,
- "username": "ysu0001-yeedu@yeedu.io",
- "display_name": "YSU0001",
- "email": "ysu0001-yeedu@yeedu.io"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 3,
- "total_pages": 1,
- "limit": 100
}
}Get User Details for a specific User Id
Retrieves detailed information for a specific user by their user ID within a tenant.
Authorizations:
path Parameters
| user_id required | integer <int64> User Id that will be used for filter |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "user_id": 12,
- "username": "ru0001-yeedu@yeedu.io",
- "display_name": "RU0001",
- "email": "ru0001-yeedu@yeedu.io",
- "from_date": "2024-06-20T10:38:48.746144+00:00",
- "to_date": "infinity"
}Get all the User Roles for a specific User Id
User roles are filtered for a specific User Id and returned in the form of JSON
Authorizations:
path Parameters
| user_id required | integer <int64> User Id that will be used for filter |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "user_id": 11,
- "username": "ysu0002-yeedu@yeedu.io",
- "email": "ysu0002-yeedu@yeedu.io",
- "user_roles": null,
- "group_roles": [
- {
- "group_id": 355,
- "group_name": "G_Yeedu_Infra",
- "group_mail": null,
- "group_object_id": "e56deqwaa-554f-485f-awdac-e409wda8f0",
- "group_type": null,
- "role_id": 4,
- "group_role": "Platform Billing"
}
], - "from_date": "2024-06-20T10:27:04.892889+00:00",
- "to_date": "infinity"
}Get all the Users and their respective roles in a tenant
All the Users and their respective roles in a tenant are returned in the form of JSON
Authorizations:
query Parameters
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io",
- "user_roles": [
- "Admin",
- "Platform Admin"
], - "group_roles": [
- null
]
}, - {
- "user_id": 2,
- "username": "ya0000-yeedu@yeedu.io",
- "user_roles": [
- "Admin"
], - "group_roles": [
- null
]
}, - {
- "user_id": 3,
- "username": "rp0000-yeedu@yeedu.io",
- "user_roles": [
- "Can Manage Cluster"
], - "group_roles": [
- null
]
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 3,
- "total_pages": 1,
- "limit": 2
}
}Get all the Users for a specific Role Id
Users are filtered for a specific Role Id and returned in the form of JSON
Authorizations:
path Parameters
| role_id required | integer <int64> Role Id that will be used for filter |
query Parameters
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": {
- "role_id": 2,
- "users": [
- {
- "user_id": 2,
- "username": "ya0000-yeedu@yeedu.io",
- "display_name": "ya0000-yeedu",
- "email": "ya0000-yeedu@yeedu.io"
}, - {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io",
- "display_name": "ysu0000-yeedu",
- "email": "ysu0000-yeedu@yeedu.io"
}
]
}, - "result_set": {
- "current_page": 1,
- "total_objects": 2,
- "total_pages": 1,
- "limit": 100
}
}Get all the Groups in a tenant
All the groups present in a tenant are returned in the form of JSON
Authorizations:
query Parameters
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "group_id": 602,
- "group_name": "G_Yeedu_Auditor",
- "group_mail": null,
- "group_object_id": "ba7d4601-2342-43d800",
- "group_type": null
}, - {
- "group_id": 355,
- "group_name": "G_Yeedu_Infra",
- "group_mail": null,
- "group_object_id": "e4d02b7-e80d409b48f0",
- "group_type": null
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 2,
- "total_pages": 1,
- "limit": 100
}
}Search all the Groups by groupname in a tenant
Searches all the Groups by group name within a tenant and returns the matching groups in the form of JSON
Authorizations:
query Parameters
| groupname required | string Group name that will be used for filter |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "group_id": 602,
- "group_name": "G_Yeedu_Auditor",
- "group_mail": null,
- "group_object_id": "ba7d4601-2342-43d800",
- "group_type": null
}, - {
- "group_id": 355,
- "group_name": "G_Yeedu_Infra",
- "group_mail": null,
- "group_object_id": "e4d02b7-e80d409b48f0",
- "group_type": null
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 2,
- "total_pages": 1,
- "limit": 100
}
}Get Group Details for a specific Group Id
Group details are filtered for a specific Group Id and returned in the form of JSON
Authorizations:
path Parameters
| group_id required | integer <int64> Group Id that will be used for filter |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "group_id": 602,
- "group_name": "G_Yeedu_Auditor",
- "group_object_id": "ba7d4601-57-a055-953d800",
- "group_type": null,
- "from_date": "2024-06-20T11:25:39.573594+00:00",
- "to_date": "infinity"
}Get all the Group Roles for a specific Group Id
Group Roles are filtered for a specific Group Id and returned in the form of JSON
Authorizations:
path Parameters
| group_id required | integer <int64> Group Id that will be used for filter |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "group_id": 602,
- "group_name": "G_Yeedu_Auditor",
- "group_type": null,
- "roles": [
- "Admin"
]
}Get all the Groups and their respective roles
Groups and their respective roles are returned in the form of JSON
Authorizations:
query Parameters
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "group_id": 117,
- "group_name": "G_Yeedu_Manager",
- "group_mail": null,
- "group_type": null,
- "group_object_id": "bbdfd-c33a129d36d4",
- "roles": [
- "Can Manage Cluster"
]
}, - {
- "group_id": 167,
- "group_name": "G_Yeedu_Infra",
- "group_mail": null,
- "group_type": "Unified",
- "group_object_id": "ff35e0aa-a29cb5cb8415",
- "roles": [
- "Platform Billing"
]
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 5,
- "total_pages": 3,
- "limit": 2,
- "next_page": 2
}
}Get all the Groups for a specific Role Id
Groups are filtered for a specific Role Id and returned in the form of JSON
Authorizations:
path Parameters
| role_id required | integer <int64> Role Id that will be used for filter |
query Parameters
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": {
- "role_id": 2,
- "groups": [
- {
- "group_id": 602,
- "group_name": "G_Yeedu_Auditor",
- "group_mail": null,
- "group_object_id": "ba7d4601-2342-aeb953d800",
- "group_type": null
}
]
}, - "result_set": {
- "current_page": 1,
- "total_objects": 1,
- "total_pages": 1,
- "limit": 100
}
}Create a new User Role
Rules to assign a role to a user
- Platform Admin Role cannot be assigned to any user.
- All other roles can be assigned to a user.
Authorizations:
path Parameters
| user_id required | integer <int64> User Id that will be used for adding role |
| role_id required | integer <int64> Role Id that will be used for adding role |
Responses
Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "user_roles_id": "36",
- "tenant_id": "9d6d3054-a5f6-4dbf-86f9-26989eb73ed3",
- "user_id": "9",
- "role_id": 2,
- "created_by_user_id": "2",
- "modified_by_user_id": "2",
- "last_update_date": "2024-06-21T17:03:17.197Z",
- "from_date": "2024-06-21T17:03:17.197Z",
- "to_date": null
}Delete an existing User Role for a specific User Id and Role Id
Rules to delete a role of a user
- The Platform Admin Role cannot be deleted for any user.
- An Admin cannot delete their own Admin Role.
- The Admin Role can be deleted for other Admin users.
- All other roles can be deleted for a user.
Authorizations:
path Parameters
| user_id required | integer <int64> User Id that will be used for deleting role |
| role_id required | integer <int64> Role Id that will be used for deleting role |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 500
{- "message": "Deleted User Roles for the provided User Id: 9 and Role Id: 2"
}Create a new Group Role
Rules to assign a role to a group
- Platform Admin Role cannot be assigned to any group.
- All other roles can be assigned to a group.
Authorizations:
path Parameters
| group_id required | integer <int64> Group Id that will be used for adding role |
| role_id required | integer <int64> Role Id that will be used for adding role |
Responses
Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "group_roles_id": "11",
- "tenant_id": "9d6d3054-a5f6-4dbf-86f9-26989eb73ed3",
- "group_id": "117",
- "role_id": 2,
- "created_by_user_id": "2",
- "modified_by_user_id": "2",
- "last_update_date": "2024-06-21T17:06:23.498Z",
- "from_date": "2024-06-21T17:06:23.498Z",
- "to_date": null
}Delete an existing Group Role for a specific Group Id and Role Id
Rules to delete a role of a group
- The Platform Admin Role cannot be deleted for any group.
- The Admin Role cannot be deleted for the caller's own group.
- The Admin Role can be deleted for other Admin groups.
- All other roles can be deleted for a group.
Authorizations:
path Parameters
| group_id required | integer <int64> Group Id that will be used for deleting role |
| role_id required | integer <int64> Role Id that will be used for deleting role |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 500
{- "message": "Deleted Group Role for the provided group ID: 117, and role ID: 2"
}Get all the Tenant Secrets matching with the Secret name
Retrieves a paginated list of tenant secrets filtered by secret name and optional secret type, returning matching secrets in JSON format for the tenant associated with the current user session.
Authorizations:
query Parameters
| secret_name required | string Secret name that will be used for filter |
| secret_type | string Enum: "HIVE KERBEROS" "HIVE BASIC" "DATABRICKS UNITY TOKEN" "ENVIRONMENT VARIABLE" "AWS ACCESS SECRET KEY PAIR" "AZURE SERVICE PRINCIPAL" "GOOGLE SERVICE ACCOUNT" Type of secret to filter the secrets. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "data": [
- {
- "tenant_secret_id": 4,
- "description": null,
- "env_var_secret": "secretKeyName",
- "tenant": {
- "tenant_id": "28d4f82a-fa93-442b-b8fb-d8946ee3393f",
- "name": "test-tenant",
- "description": "Yeedu test environment tenant"
}, - "created_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2025-03-04T10:41:41.907762+00:00",
- "from_date": "2025-03-04T10:41:41.907762+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 2,
- "total_pages": 2,
- "limit": 1,
- "next_page": 2
}
}
]Get all the tenant secrets.
Retrieves a paginated list of tenant secrets for the tenant linked to the current user session, optionally filtered by secret type or tenant secret ID.
Authorizations:
query Parameters
| secret_type | string Enum: "HIVE KERBEROS" "HIVE BASIC" "DATABRICKS UNITY TOKEN" "ENVIRONMENT VARIABLE" "AWS ACCESS SECRET KEY PAIR" "AZURE SERVICE PRINCIPAL" "GOOGLE SERVICE ACCOUNT" Type of secret to filter the secrets. |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
| tenant_secret_id | string Filter secrets by a specific tenant secret ID. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "tenant_secret_id": 4,
- "description": null,
- "env_var_secret": "secretKeyName",
- "tenant": {
- "tenant_id": "28d4f82a-fa93-442b-b8fb-d8946ee3393f",
- "name": "test-tenant",
- "description": "Yeedu test environment tenant"
}, - "created_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000@yeedu.io"
}, - "last_update_date": "2025-03-04T10:41:41.907762+00:00",
- "from_date": "2025-03-04T10:41:41.907762+00:00",
- "to_date": "infinity"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 2,
- "total_pages": 2,
- "limit": 1,
- "next_page": 2
}
}Create a new tenant secret.
Creates a new tenant secret for the tenant associated with the current user session.
- Only users with the Platform Admin role or Admin role within a tenant are authorized to create tenant secrets for the associated tenant.
- Each secret name must be unique within the tenant.
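The call shape is analogous to creating a user secret, but requires the Platform Admin or Admin role and a name that is unique within the tenant. The sketch below is illustrative only: the route and header scheme are assumptions, and the credentials are dummy values taken from the request sample below.

```python
import requests

BASE_URL = "https://yeedu.example.com/api/v1"        # hypothetical host; base path assumed
HEADERS = {"Authorization": "Bearer <ADMIN_TOKEN>"}  # Platform Admin or Admin role required

payload = {
    "secret_type": "AWS ACCESS SECRET KEY PAIR",
    "name": "aws_secret",                     # unique within the tenant
    "description": "AWS credentials for accessing S3",
    "aws_access_key_id": "AKIAXXXXXXX",       # dummy value
    "aws_secret_access_key": "abcd1234XXXX",  # dummy value
    "aws_default_region": "us-west-2",
}

# NOTE: the literal route is a placeholder.
resp = requests.post(f"{BASE_URL}/tenant/secret", headers=HEADERS, json=payload, timeout=30)
resp.raise_for_status()
print(resp.json()["tenant_secret_id"])
```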
Authorizations:
Request Body schema: application/jsonrequired
Tenant secret to be created.
| secret_type required | string Enum: "HIVE KERBEROS" "HIVE BASIC" "DATABRICKS UNITY TOKEN" "ENVIRONMENT VARIABLE" "AWS ACCESS SECRET KEY PAIR" "AZURE SERVICE PRINCIPAL" "GOOGLE SERVICE ACCOUNT" Type of authentication secret. |
| name required | string non-empty Secret identifier. |
| description | string or null non-empty Optional secret details. |
| principal required | string non-empty Kerberos principal. |
| keytab required | string non-empty Keytab file path. |
Responses
Request samples
- Payload
{- "secret_type": "AWS ACCESS SECRET KEY PAIR",
- "name": "aws_secret",
- "description": "AWS credentials for accessing S3",
- "type": "string",
- "project_id": "string",
- "private_key_id": "string",
- "private_key": "string",
- "client_email": "string",
- "client_id": "string",
- "auth_uri": "string",
- "token_uri": "string",
- "auth_provider_x509_cert_url": "string",
- "client_x509_cert_url": "string",
- "access_key": "string",
- "secret_key": "string",
- "universe_domain": "string",
- "aws_access_key_id": "AKIAXXXXXXX",
- "aws_secret_access_key": "abcd1234XXXX",
- "aws_default_region": "us-west-2"
}Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "tenant_secret_id": 789,
- "tenant_name": "aws_secret",
- "description": "AWS credentials for accessing S3",
- "tenant_id": "123e4567-e89b-12d3-a456-426614174000",
- "lookup_secret_type_id": 6,
- "created_by_user_id": 301,
- "modified_by_user_id": 302,
- "last_update_date": "2025-03-18T15:30:00Z",
- "from_date": "2025-03-10T00:00:00Z",
- "to_date": "9999-12-31T23:59:59Z",
- "secret_type": "AWS ACCESS SECRET KEY PAIR",
- "aws_access_key_id": "AKIAXXXXXXX",
- "aws_secret_access_key": "abcd1234XXXX",
- "region": "us-west-2"
}Update an existing tenant secret.
Updates the details of an existing tenant secret for the tenant associated with the current user session.
Authorizations:
query Parameters
| tenant_secret_id required | integer <int64> The ID of the tenant secret to be updated. |
Request Body schema: application/jsonrequired
Tenant secret details to be updated.
| secret_type | string Enum: "HIVE KERBEROS" "HIVE BASIC" "DATABRICKS UNITY TOKEN" "ENVIRONMENT VARIABLE" "AWS ACCESS SECRET KEY PAIR" "AZURE SERVICE PRINCIPAL" "GOOGLE SERVICE ACCOUNT" Type of authentication secret. |
| description | string or null non-empty Optional secret details. |
| principal | string non-empty Kerberos principal. |
| keytab | string non-empty Keytab file path. |
Responses
Request samples
- Payload
{- "secret_type": "AWS ACCESS SECRET KEY PAIR",
- "description": "AWS credentials for accessing S3",
- "type": "string",
- "project_id": "string",
- "private_key_id": "string",
- "private_key": "string",
- "client_email": "string",
- "client_id": "string",
- "auth_uri": "string",
- "token_uri": "string",
- "auth_provider_x509_cert_url": "string",
- "client_x509_cert_url": "string",
- "access_key": "string",
- "secret_key": "string",
- "universe_domain": "string",
- "aws_access_key_id": "AKIAXXXXXXX",
- "aws_secret_access_key": "abcd1234XXXX",
- "aws_default_region": "us-west-2"
}Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "tenant_secret_id": 789,
- "tenant_name": "aws_secret",
- "description": "AWS credentials for accessing S3",
- "tenant_id": "123e4567-e89b-12d3-a456-426614174000",
- "lookup_secret_type_id": 6,
- "created_by_user_id": 301,
- "modified_by_user_id": 302,
- "last_update_date": "2025-03-18T15:30:00Z",
- "from_date": "2025-03-10T00:00:00Z",
- "to_date": "9999-12-31T23:59:59Z",
- "secret_type": "AWS ACCESS SECRET KEY PAIR",
- "aws_access_key_id": "AKIAXXXXXXX",
- "aws_secret_access_key": "abcd1234XXXX",
- "region": "us-west-2"
}Delete an existing tenant secret.
Deletes a tenant secret for the tenant associated with the current user session.
Authorizations:
query Parameters
| tenant_secret_id required | integer <int64> The ID of the tenant secret to be deleted. |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 500
{- "message": "Deleted tenant secret id: 1."
}Get all the Tenants
This endpoint returns details of all tenants registered on the platform in JSON format. The response includes tenant metadata such as tenant ID, name, description, and status.
Authorizations:
query Parameters
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "data": [
- {
- "tenant_id": "49a8817e-8b8e-4d76-a717-c33db6f7e018",
- "name": "test",
- "description": null,
- "created_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "last_update_date": "2024-06-20T17:16:28.24551+00:00",
- "from_date": "2024-06-20T17:16:28.24551+00:00",
- "to_date": "infinity"
}
], - "result_set": null,
- "current_page": 1,
- "total_objects": 1,
- "total_pages": 1,
- "limit": 100
}
]Get all the Tenants matching with the tenant name
Allows filtering tenants whose names match (or partially match) the provided tenant name query parameter. Useful for finding specific tenants quickly within large tenant lists.
Authorizations:
query Parameters
| tenant_name required | string Tenant name that will be used for filter |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "data": [
- {
- "tenant_id": "49a8817e-8b8e-4d76-a717-c33db6f7e018",
- "name": "test",
- "description": null,
- "created_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "last_update_date": "2024-06-20T17:16:28.24551+00:00",
- "from_date": "2024-06-20T17:16:28.24551+00:00",
- "to_date": "infinity"
}
], - "result_set": null,
- "current_page": 1,
- "total_objects": 1,
- "total_pages": 1,
- "limit": 100
}
]Create a new Tenant
Creates a new tenant entity with detailed configuration data (such as tenant name, description) supplied in the request body.
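A minimal sketch of creating a tenant; the body fields (name, description) and the tenant_id in the response come from the schema and sample below, while the route and bearer header are assumptions.

```python
import requests

BASE_URL = "https://yeedu.example.com/api/v1"                 # hypothetical host; base path assumed
HEADERS = {"Authorization": "Bearer <PLATFORM_ADMIN_TOKEN>"}  # auth header scheme assumed

# NOTE: the literal route is a placeholder; only the body fields are documented below.
resp = requests.post(
    f"{BASE_URL}/tenant",
    headers=HEADERS,
    json={"name": "tenant", "description": "Tenant description"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["tenant_id"])
```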
Authorizations:
Request Body schema: application/jsonrequired
The Tenant to be added
| name required | string non-empty |
| description | string or null non-empty |
Responses
Request samples
- Payload
{- "name": "tenant",
- "description": "Tenant description"
}Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "tenant_id": "a6ca4076-575a-45b0-a6e9-9e54e1940d23",
- "name": "test",
- "description": null,
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-06-21T17:18:54.215Z",
- "from_date": "2024-06-21T17:18:54.215Z",
- "to_date": null
}Get Tenant Details for a specific Tenant Id or Name
Tenant details are filtered for a specific Tenant Id or Name and returned in the form of JSON
Authorizations:
query Parameters
| tenant_id | string <uuid> Tenant Id that will be used for filter |
| tenant_name | string Tenant Name that will be used for filter |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "tenant_id": "49a8817e-8b8e-4d76-a717-c33db6f7e018",
- "name": "test",
- "description": null,
- "created_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "modified_by": {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io"
}, - "last_update_date": "2024-06-20T17:16:28.24551+00:00",
- "from_date": "2024-06-20T17:16:28.24551+00:00",
- "to_date": "infinity"
}
], - "result_set": null,
- "current_page": 1,
- "total_objects": 1,
- "total_pages": 1,
- "limit": 100
}Update an existing Tenant for a specific Tenant Id or Name
Updates the details of an existing tenant entity for a specific Tenant Id or Name. The request body should contain the updated tenant information.
Authorizations:
query Parameters
| tenant_id | string <uuid> Tenant Id that will be used for Modification |
| tenant_name | string Tenant Name that will be used for Modification |
Request Body schema: application/jsonrequired
The Tenant to be modified
| name | string non-empty |
| description | string or null non-empty |
Responses
Request samples
- Payload
{- "name": "test_2",
- "description": "test_tenant"
}Response samples
- 201
- 400
- 401
- 403
- 404
- 409
- 500
{- "tenant_id": "a6ca4076-575a-45b0-a6e9-9e54e1940d23",
- "name": "test",
- "description": null,
- "created_by_user_id": "1",
- "modified_by_user_id": "1",
- "last_update_date": "2024-06-21T17:18:54.215Z",
- "from_date": "2024-06-21T17:18:54.215Z",
- "to_date": null
}Delete an existing Tenant for a specific Tenant Id or Name
Deletes an existing tenant entity for a specific Tenant Id or Name. The request must specify either the tenant ID or tenant name to identify the tenant to be deleted.
Authorizations:
query Parameters
| tenant_id | string <uuid> Tenant Id that will be used for Deletion |
| tenant_name | string Tenant Name that will be used for Deletion |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 500
{- "message": "Deleted tenant id: a6ca4076-575a-45b0-a6e9-9e54e1940d23."
}Get all the Users in a Tenant
Fetch all users associated with the tenant specified by tenant_id. Supports pagination to limit and offset the result set.
Authorizations:
path Parameters
| tenant_id required | string <uuid> Specifies the ID of the tenant for filtering. |
query Parameters
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "user_id": 13,
- "username": "ru0002-yeedu@yeedu.io",
- "display_name": "RU0001",
- "email": "ru0002-yeedu@yeedu.io"
}, - {
- "user_id": 12,
- "username": "ru0001-yeedu@yeedu.io",
- "display_name": "RU0001",
- "email": "ru0001-yeedu@yeedu.io"
}, - {
- "user_id": 10,
- "username": "ysu0001-yeedu@yeedu.io",
- "display_name": "YSU0001",
- "email": "ysu0001-yeedu@yeedu.io"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 3,
- "total_pages": 1,
- "limit": 100
}
}Search all the users by username in a Tenant
Search all the users by username present in a tenant
Authorizations:
path Parameters
| tenant_id required | string <uuid> Specifies the ID of the tenant for filtering. |
query Parameters
| username required | string Username that will be used for filter |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "user_id": 13,
- "username": "ru0002-yeedu@yeedu.io",
- "display_name": "RU0001",
- "email": "ru0002-yeedu@yeedu.io"
}, - {
- "user_id": 12,
- "username": "ru0001-yeedu@yeedu.io",
- "display_name": "RU0001",
- "email": "ru0001-yeedu@yeedu.io"
}, - {
- "user_id": 10,
- "username": "ysu0001-yeedu@yeedu.io",
- "display_name": "YSU0001",
- "email": "ysu0001-yeedu@yeedu.io"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 3,
- "total_pages": 1,
- "limit": 100
}
}Get all the Groups in a Tenant
All the groups present in a tenant are returned in the form of JSON
Authorizations:
path Parameters
| tenant_id required | string <uuid> Specifies the ID of the tenant for filtering. |
query Parameters
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "group_id": 602,
- "group_name": "G_Yeedu_Auditor",
- "group_mail": null,
- "group_object_id": "ba7d4601-2342-43d800",
- "group_type": null
}, - {
- "group_id": 355,
- "group_name": "G_Yeedu_Infra",
- "group_mail": null,
- "group_object_id": "e4d02b7-e80d409b48f0",
- "group_type": null
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 2,
- "total_pages": 1,
- "limit": 100
}
}Search all the Groups by groupname in a Tenant
All the Groups matching the provided group name in a Tenant are returned in the form of JSON
Authorizations:
path Parameters
| tenant_id required | string <uuid> Specifies the ID of the tenant for filtering. |
query Parameters
| groupname required | string Group name that will be used for filter |
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "group_id": 602,
- "group_name": "G_Yeedu_Auditor",
- "group_mail": null,
- "group_object_id": "ba7d4601-2342-43d800",
- "group_type": null
}, - {
- "group_id": 355,
- "group_name": "G_Yeedu_Infra",
- "group_mail": null,
- "group_object_id": "e4d02b7-e80d409b48f0",
- "group_type": null
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 2,
- "total_pages": 1,
- "limit": 100
}
}Get User Details for a specific Tenant Id and User Id
User details are filtered for a specific Tenant Id along with User Id and returned in the form of JSON
Authorizations:
path Parameters
| tenant_id required | string <uuid> Specifies the ID of the tenant for filtering. |
| user_id required | integer <int64> User Id that will be used for filter |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "user_id": 12,
- "username": "ru0001-yeedu@yeedu.io",
- "display_name": "RU0001",
- "email": "ru0001-yeedu@yeedu.io",
- "from_date": "2024-06-20T10:38:48.746144+00:00",
- "to_date": "infinity"
}Get Group Details for a specific Tenant Id and Group Id
Group details are filtered for a specific Tenant Id along with Group Id and returned in the form of JSON
Authorizations:
path Parameters
| tenant_id required | string <uuid> Specifies the ID of the tenant for filtering. |
| group_id required | integer <int64> Group Id that will be used for filter |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "group_id": 602,
- "group_name": "G_Yeedu_Auditor",
- "group_object_id": "ba7d4601-57-a055-953d800",
- "group_type": null,
- "from_date": "2024-06-20T11:25:39.573594+00:00",
- "to_date": "infinity"
}Get all the Users and their respective roles in a tenant
All the Users and their respective roles in a tenant are returned in the form of JSON
Authorizations:
path Parameters
| tenant_id required | string <uuid> Specifies the ID of the tenant for filtering. |
query Parameters
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io",
- "user_roles": [
- "Admin",
- "Platform Admin"
], - "group_roles": [
- null
]
}, - {
- "user_id": 2,
- "username": "ya0000-yeedu@yeedu.io",
- "user_roles": [
- "Admin"
], - "group_roles": [
- null
]
}, - {
- "user_id": 3,
- "username": "rp0000-yeedu@yeedu.io",
- "user_roles": [
- "Can Manage Cluster"
], - "group_roles": [
- null
]
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 3,
- "total_pages": 1,
- "limit": 2
}
}Get User Roles for a specific Tenant Id and User Id
User roles are filtered for a specific Tenant Id along with User Id and returned in the form of JSON
Authorizations:
path Parameters
| tenant_id required | string <uuid> Specifies the ID of the tenant for filtering. |
| user_id required | integer <int64> User Id that will be used for filter |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "user_id": 11,
- "username": "ysu0002-yeedu@yeedu.io",
- "email": "ysu0002-yeedu@yeedu.io",
- "user_roles": null,
- "group_roles": [
- {
- "group_id": 355,
- "group_name": "G_Yeedu_Infra",
- "group_mail": null,
- "group_object_id": "e56deqwaa-554f-485f-awdac-e409wda8f0",
- "group_type": null,
- "role_id": 4,
- "group_role": "Platform Billing"
}
], - "from_date": "2024-06-20T10:27:04.892889+00:00",
- "to_date": "infinity"
}Get Tenants for a specific User Id
Tenants are filtered for a specific User Id and returned in the form of JSON
Authorizations:
path Parameters
| user_id required | integer <int64> User Id that will be used for filter |
query Parameters
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "tenant_id": "ffafef39-1948-4cd2-81b0-8243eadf6f75",
- "name": "manual_ui_test",
- "description": null
}, - {
- "tenant_id": "59f0724b-d711-47ab-8010-497f1f715735",
- "name": "integration_test_tenant",
- "description": "Creating_tenant_integration_test_tenant"
}, - {
- "tenant_id": "cf1f945f-01ce-4ac6-a070-8c733f2fa791",
- "name": "ui_testing",
- "description": "creating tenant ui_testing"
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 3,
- "total_pages": 1,
- "limit": 100
}
}Get all the Groups and their respective roles
Groups and their respective roles are returned in the form of JSON
Authorizations:
path Parameters
| tenant_id required | string <uuid> Specifies the ID of the tenant for filtering. |
query Parameters
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": [
- {
- "group_id": 117,
- "group_name": "G_Yeedu_Manager",
- "group_mail": null,
- "group_type": null,
- "group_object_id": "bbdfd-c33a129d36d4",
- "roles": [
- "Can Manage Cluster"
]
}, - {
- "group_id": 167,
- "group_name": "G_Yeedu_Infra",
- "group_mail": null,
- "group_type": "Unified",
- "group_object_id": "ff35e0aa-a29cb5cb8415",
- "roles": [
- "Platform Billing"
]
}
], - "result_set": {
- "current_page": 1,
- "total_objects": 5,
- "total_pages": 3,
- "limit": 2,
- "next_page": 2
}
}Get Group Roles for a specific Tenant and Group Id
Group Roles are filtered for a specific Tenant Id along with Group Id and returned in the form of JSON
Authorizations:
path Parameters
| tenant_id required | string <uuid> Specifies the ID of the tenant for filtering. |
| group_id required | integer <int64> Group Id that will be used for filter |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "group_id": 602,
- "group_name": "G_Yeedu_Auditor",
- "group_type": null,
- "roles": [
- "Admin"
]
}Get Users for a specific Tenant Id and Role Id
Users are filtered for a specific Tenant Id along with Role Id and returned in the form of JSON
Authorizations:
path Parameters
| tenant_id required | string <uuid> Specifies the ID of the tenant for filtering. |
| role_id required | integer <int64> Role Id that will be used for filter |
query Parameters
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": {
- "role_id": 2,
- "users": [
- {
- "user_id": 2,
- "username": "ya0000-yeedu@yeedu.io",
- "display_name": "ya0000-yeedu",
- "email": "ya0000-yeedu@yeedu.io"
}, - {
- "user_id": 1,
- "username": "ysu0000-yeedu@yeedu.io",
- "display_name": "ysu0000-yeedu",
- "email": "ysu0000-yeedu@yeedu.io"
}
]
}, - "result_set": {
- "current_page": 1,
- "total_objects": 2,
- "total_pages": 1,
- "limit": 100
}
}Get Groups for a specific Tenant Id and Role Id
Groups are filtered for a specific Tenant Id along with Role Id and returned in the form of JSON
Authorizations:
path Parameters
| tenant_id required | string <uuid> Specifies the ID of the tenant for filtering. |
| role_id required | integer <int64> Role Id that will be used for filter |
query Parameters
| limit | integer <int32> Default: 100 The numbers of items to return. |
| pageNumber | integer <int32> Default: 1 The page number for the items to return. |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
{- "data": {
- "role_id": 2,
- "groups": [
- {
- "group_id": 602,
- "group_name": "G_Yeedu_Auditor",
- "group_mail": null,
- "group_object_id": "ba7d4601-2342-aeb953d800",
- "group_type": null
}
]
}, - "result_set": {
- "current_page": 1,
- "total_objects": 1,
- "total_pages": 1,
- "limit": 100
}
}Create a new User Role
Rules to assign a role to a user
- If the role_id provided is Platform Admin and tenant_id is NULL, the role can be created.
- If the role_id provided is Platform Admin and tenant_id is not NULL, the role cannot be created, since the user would otherwise be bound to a specific tenant.
- For any other role_id, the tenant_id is required.
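A minimal sketch of assigning a role, with user_id and role_id as path parameters and tenant_id as an optional query parameter (omitted only for the Platform Admin case described above). The route and header scheme are placeholders not shown in this reference.

```python
import requests

BASE_URL = "https://yeedu.example.com/api/v1"        # hypothetical host; base path assumed
HEADERS = {"Authorization": "Bearer <ADMIN_TOKEN>"}  # auth header scheme assumed

user_id, role_id = 9, 2
tenant_id = "9d6d3054-a5f6-4dbf-86f9-26989eb73ed3"  # set to None only when assigning Platform Admin

# NOTE: the literal route is a placeholder; user_id and role_id are path parameters,
# tenant_id is passed as a query parameter.
resp = requests.post(
    f"{BASE_URL}/admin/user/{user_id}/role/{role_id}",
    headers=HEADERS,
    params={"tenant_id": tenant_id} if tenant_id else None,
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["user_roles_id"])
```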
Authorizations:
path Parameters
| user_id required | integer <int64> User Id that will be used for adding role |
| role_id required | integer <int64> Role Id that will be used for adding role |
query Parameters
| tenant_id | string <uuid> Tenant Id that will be used for adding role |
Responses
Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "user_roles_id": "36",
- "tenant_id": "9d6d3054-a5f6-4dbf-86f9-26989eb73ed3",
- "user_id": "9",
- "role_id": 2,
- "created_by_user_id": "2",
- "modified_by_user_id": "2",
- "last_update_date": "2024-06-21T17:03:17.197Z",
- "from_date": "2024-06-21T17:03:17.197Z",
- "to_date": null
}Delete an existing User Role for a specific Tenant, User and Role Id
Rules to delete a role of a user
- A Platform Admin cannot delete their own Platform Admin Role.
- The Platform Admin Role can be deleted for other Platform Admin users.
Authorizations:
path Parameters
| user_id required | integer <int64> User Id that will be used for deleting role |
| role_id required | integer <int64> Role Id that will be used for deleting role |
query Parameters
| tenant_id | string <uuid> Tenant Id that will be used for deleting role |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 500
{- "message": "Deleted User Role for the provided Tenant Id: '83d9056b-a3b3-4eb3-80db-1ed364428b0e', User Id: 2, and Role Id: 1"
}Create a new Group Role
Rules to assign a role to a group
- Platform Admin Role cannot be assigned to any group.
- All other roles can be assigned to a group across tenants.
Authorizations:
path Parameters
| group_id required | integer <int64> Group Id that will be used for adding role |
| role_id required | integer <int64> Role Id that will be used for adding role |
query Parameters
| tenant_id | string <uuid> Tenant Id that will be used for adding role |
Responses
Response samples
- 200
- 400
- 401
- 403
- 404
- 409
- 500
{- "group_roles_id": "11",
- "tenant_id": "9d6d3054-a5f6-4dbf-86f9-26989eb73ed3",
- "group_id": "117",
- "role_id": 2,
- "created_by_user_id": "2",
- "modified_by_user_id": "2",
- "last_update_date": "2024-06-21T17:06:23.498Z",
- "from_date": "2024-06-21T17:06:23.498Z",
- "to_date": null
}Delete an existing Group Role for a specific Tenant, Group and Role Id
Rules to delete a role of a group
- The Platform Admin Role cannot be deleted for the caller's own group.
- The Platform Admin Role can be deleted for other Platform Admin groups.
Authorizations:
path Parameters
| group_id required | integer <int64> Group Id that will be used for deleting role |
| role_id required | integer <int64> Role Id that will be used for deleting role |
query Parameters
| tenant_id | string <uuid> Tenant Id that will be used for deleting role |
Responses
Response samples
- 201
- 400
- 401
- 403
- 404
- 500
{- "message": "Deleted Group Role for the provided Tenant Id: '83d9056b-a3b3-4eb3-80db-1ed364428b0e', Group Id: 1, and Role Id: 1"
}Get all the Resources
This endpoint returns a complete list of all resources available in the system in JSON array format.
Authorizations:
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- [
- {
- "resource_id": 0,
- "resource_path": "/api/v1/lookup_cloud_providers",
- "from_date": "2023-01-31T05:25:41.974Z",
- "to_date": null
}
]
]Get Resource details for a specific Resource Id
Fetches detailed metadata for the resource identified by the id path parameter. Useful for inspecting an individual resource.
Authorizations:
path Parameters
| id required | integer <int64> Resource Id that will be used for filter |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "resource_id": 0,
- "resource_path": "/api/v1/lookup_cloud_providers",
- "from_date": "2023-01-31T05:25:41.974Z",
- "to_date": null
}
]Get all the Permission types
Returns a list of all permission types defined for controlling access across resources.
Authorizations:
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- [
- {
- "permission_id": 0,
- "permission": "GET",
- "description": "To list a resource",
- "from_date": "2024-06-14T08:40:52.540Z",
- "to_date": null
}
]
]Get Permission type details for a specific Permission type Id
Permission type details are filtered for a specific Permission type Id and returned in the form of JSON
Authorizations:
path Parameters
| id required | integer <int64> Permission type Id that will be used for filter |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "permission_id": 0,
- "permission": "GET",
- "description": "To list a resource",
- "from_date": "2024-06-14T08:40:52.540Z",
- "to_date": null
}
]Get all the Roles
Role details for all the roles are returned in the form of a JSON list
Authorizations:
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- [
- {
- "role_id": 0,
- "role": "User",
- "description": "GET (Lookup, Volume Config, Network Config, Boot Disk Image Config, Credentials Config, Object Storage Manager, Object Storage Manager Files, Hive Metastore Config, Cluster Configuration, Cluster Instance, Cluster Access Control, Workspace, Workspace Access Control, Spark Job, Spark Job run, Notebook, Notebook run, User) PUT (Spark Job, Workspace, Notebook) POST (Object Storage Manager Files, Workspace, Workspace Access Control, Spark Job, Spark Job run, Notebook, Notebook Run) DELETE (Workspace Access Control)",
- "from_date": "2024-06-14T08:40:52.666Z",
- "to_date": null
}
]
]Get Role details for a specific Role Id
Role details are filtered for a specific Role Id and returned in the form of JSON
Authorizations:
path Parameters
| id required | integer <int64> Role Id that will be used for filter |
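For example, a role lookup could be issued as sketched below; the base URL, token header, and route are assumptions.

```python
import os
import requests

BASE_URL = os.environ.get("YEEDU_BASE_URL", "https://yeedu.example.com")  # assumed
HEADERS = {"Authorization": f"Bearer {os.environ['YEEDU_TOKEN']}"}        # assumed auth scheme

role_id = 0
# Hypothetical route for this sketch.
resp = requests.get(f"{BASE_URL}/api/v1/role/{role_id}", headers=HEADERS)
resp.raise_for_status()

role = resp.json()[0]  # sample response is a single-element list
print(role["role"])         # e.g. "User"
print(role["description"])  # summary of the HTTP verbs and resources the role allows
```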
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "role_id": 0,
- "role": "User",
- "description": "GET (Lookup, Volume Config, Network Config, Boot Disk Image Config, Credentials Config, Object Storage Manager, Object Storage Manager Files, Hive Metastore Config, Cluster Configuration, Cluster Instance, Cluster Access Control, Workspace, Workspace Access Control, Spark Job, Spark Job run, Notebook, Notebook run, User) PUT (Spark Job, Workspace, Notebook) POST (Object Storage Manager Files, Workspace, Workspace Access Control, Spark Job, Spark Job run, Notebook, Notebook Run) DELETE (Workspace Access Control)",
- "from_date": "2024-06-14T08:40:52.666Z",
- "to_date": null
}
]Get all the Rules
Rule details for all the rules are returned in the form of a JSON list
Authorizations:
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- [
- {
- "rule_id": "754",
- "permission_type": {
- "permission_id": 0,
- "permission": "GET",
- "description": "To list a resource"
}, - "resource": {
- "resource_id": 194,
- "resource_path": "/api/v1/workspace/stats"
}, - "role": {
- "role_id": 3,
- "role": "Platform Admin",
- "description": "Can access and add or remove tenants across all tenants"
}, - "from_date": "2024-06-14T08:40:52.728Z",
- "to_date": null
}
]
]Get Rule details for a specific Rule Id
Rule details are filtered for a specific Rule Id and returned in the form of JSON
Authorizations:
path Parameters
| id required | integer <int64> Rule Id that will be used for filter |
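Each rule ties a permission type and a resource to a role, as the nested sample below shows. A retrieval sketch, assuming an illustrative base URL, token, and route:

```python
import os
import requests

BASE_URL = os.environ.get("YEEDU_BASE_URL", "https://yeedu.example.com")  # assumed
HEADERS = {"Authorization": f"Bearer {os.environ['YEEDU_TOKEN']}"}        # assumed auth scheme

rule_id = 754
# Hypothetical route for this sketch.
resp = requests.get(f"{BASE_URL}/api/v1/rule/{rule_id}", headers=HEADERS)
resp.raise_for_status()

rule = resp.json()[0]  # sample response is a single-element list
print(rule["permission_type"]["permission"],
      rule["resource"]["resource_path"], "->", rule["role"]["role"])
```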
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "rule_id": "754",
- "permission_type": {
- "permission_id": 0,
- "permission": "GET",
- "description": "To list a resource"
}, - "resource": {
- "resource_id": 194,
- "resource_path": "/api/v1/workspace/stats"
}, - "role": {
- "role_id": 3,
- "role": "Platform Admin",
- "description": "Can access and add or remove tenants across all tenants"
}, - "from_date": "2024-06-14T08:40:52.728Z",
- "to_date": null
}
]Get all the Workspace Access Control Permissions
Workspace Access Control Permissions are returned in the form of a JSON list
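A short sketch of listing these permissions and building a name-to-id map; the base URL, token header, and route are assumptions for illustration.

```python
import os
import requests

BASE_URL = os.environ.get("YEEDU_BASE_URL", "https://yeedu.example.com")  # assumed
HEADERS = {"Authorization": f"Bearer {os.environ['YEEDU_TOKEN']}"}        # assumed auth scheme

# Hypothetical route for this sketch.
resp = requests.get(f"{BASE_URL}/api/v1/workspace/permissions", headers=HEADERS)
resp.raise_for_status()

# The documented sample nests the list in an outer array.
permissions = {p["name"]: p["permission_id"] for p in resp.json()[0]}
print(permissions)  # e.g. {"VIEW": 0, ...}
```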
Authorizations:
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- [
- {
- "permission_id": 0,
- "name": "VIEW",
- "description": "To list the jobs inside the workspace",
- "from_date": "2024-06-14T08:40:53.788Z",
- "to_date": null
}
]
]Get a Workspace Access Control Permission by Permission Id
Returns metadata and attributes for a workspace permission identified by permission_id.
Authorizations:
path Parameters
| permission_id required | integer <int64> Permission Id that will be used for filter |
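A lookup sketch with basic handling of a missing permission; the base URL, auth header, and route are illustrative assumptions.

```python
import os
import requests

BASE_URL = os.environ.get("YEEDU_BASE_URL", "https://yeedu.example.com")  # assumed
HEADERS = {"Authorization": f"Bearer {os.environ['YEEDU_TOKEN']}"}        # assumed auth scheme

permission_id = 0
# Hypothetical route for this sketch.
resp = requests.get(f"{BASE_URL}/api/v1/workspace/permission/{permission_id}", headers=HEADERS)

if resp.status_code == 404:
    print("No workspace permission with that id")
else:
    resp.raise_for_status()
    permission = resp.json()[0]  # sample response is a single-element list
    print(permission["name"], "-", permission["description"])
```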
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "permission_id": 0,
- "name": "VIEW",
- "description": "To list the jobs inside the workspace",
- "from_date": "2024-06-14T08:40:53.788Z",
- "to_date": null
}
]Get all the Rules for a specific Permission type Id
Rule details are filtered for a specific Permission type Id and returned in the form of a JSON list
Authorizations:
path Parameters
| auth_permissions_type_id required | integer <int64> Permission type Id that will be used for filter |
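A filtering sketch under assumed deployment details; the route layout is hypothetical and the loop follows the nested shape of the documented sample.

```python
import os
import requests

BASE_URL = os.environ.get("YEEDU_BASE_URL", "https://yeedu.example.com")  # assumed
HEADERS = {"Authorization": f"Bearer {os.environ['YEEDU_TOKEN']}"}        # assumed auth scheme

auth_permissions_type_id = 0  # e.g. the GET permission type
# Hypothetical route for this sketch.
resp = requests.get(
    f"{BASE_URL}/api/v1/rules/permission/{auth_permissions_type_id}", headers=HEADERS
)
resp.raise_for_status()

rules = resp.json()[0]  # the documented sample nests the list in an outer array
print(f"{len(rules)} rule(s) use this permission type")
for rule in rules:
    print(rule["role"]["role"], rule["resource"]["resource_path"])
```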
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- [
- {
- "rule_id": "754",
- "permission_type": {
- "permission_id": 0,
- "permission": "GET",
- "description": "To list a resource"
}, - "resource": {
- "resource_id": 194,
- "resource_path": "/api/v1/workspace/stats"
}, - "role": {
- "role_id": 3,
- "role": "Platform Admin",
- "description": "Can access and add or remove tenants across all tenants"
}, - "from_date": "2024-06-14T08:40:52.728Z",
- "to_date": null
}
]
]Get Rule details for a specific Rule Id and Permission type Id
Rule details are filtered for a specific Rule Id along with Permission type Id and returned in the form of JSON
Authorizations:
path Parameters
| id required | integer <int64> Rule Id that will be used for filter |
| auth_permissions_type_id required | integer <int64> Permission type Id that will be used for filter |
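A two-parameter lookup sketch; as before, the base URL, token header, and route layout are assumptions and may differ from the real path.

```python
import os
import requests

BASE_URL = os.environ.get("YEEDU_BASE_URL", "https://yeedu.example.com")  # assumed
HEADERS = {"Authorization": f"Bearer {os.environ['YEEDU_TOKEN']}"}        # assumed auth scheme

rule_id = 754
auth_permissions_type_id = 0

# Hypothetical route for this sketch; the real path layout may differ.
resp = requests.get(
    f"{BASE_URL}/api/v1/rule/{rule_id}/permission/{auth_permissions_type_id}",
    headers=HEADERS,
)
resp.raise_for_status()
rule = resp.json()[0]  # sample response is a single-element list
print(rule["resource"]["resource_path"], "->", rule["role"]["role"])
```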
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "rule_id": "754",
- "permission_type": {
- "permission_id": 0,
- "permission": "GET",
- "description": "To list a resource"
}, - "resource": {
- "resource_id": 194,
- "resource_path": "/api/v1/workspace/stats"
}, - "role": {
- "role_id": 3,
- "role": "Platform Admin",
- "description": "Can access and add or remove tenants across all tenants"
}, - "from_date": "2024-06-14T08:40:52.728Z",
- "to_date": null
}
]Get all the Rules for a specific Resource Id
Rule details are filtered for a specific Resource Id and returned in the form of a JSON list
Authorizations:
path Parameters
| auth_resources_id required | integer <int64> Resource Id that will be used for filter |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- [
- {
- "rule_id": "754",
- "permission_type": {
- "permission_id": 0,
- "permission": "GET",
- "description": "To list a resource"
}, - "resource": {
- "resource_id": 194,
- "resource_path": "/api/v1/workspace/stats"
}, - "role": {
- "role_id": 3,
- "role": "Platform Admin",
- "description": "Can access and add or remove tenants across all tenants"
}, - "from_date": "2024-06-14T08:40:52.728Z",
- "to_date": null
}
]
]Get Rule details for a specific Resource Id and Rule Id
Rule details are filtered for a specific Resource Id along with Rule Id and returned in the form of JSON
Authorizations:
path Parameters
| id required | integer <int64> Rule Id that will be used for filter |
| auth_resources_id required | integer <int64> Resource Id that will be used for filter |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "rule_id": "754",
- "permission_type": {
- "permission_id": 0,
- "permission": "GET",
- "description": "To list a resource"
}, - "resource": {
- "resource_id": 194,
- "resource_path": "/api/v1/workspace/stats"
}, - "role": {
- "role_id": 3,
- "role": "Platform Admin",
- "description": "Can access and add or remove tenants across all tenants"
}, - "from_date": "2024-06-14T08:40:52.728Z",
- "to_date": null
}
]Get all the Rules for a specific Permission type Id and Resource Id
Rule details are filtered for a specific Permission type Id along with Resource Id and returned in the form of a JSON list
Authorizations:
path Parameters
| auth_resources_id required | integer <int64> Resource Id that will be used for filter |
| auth_permissions_type_id required | integer <int64> Permission type Id that will be used for filter |
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- [
- {
- "rule_id": "754",
- "permission_type": {
- "permission_id": 0,
- "permission": "GET",
- "description": "To list a resource"
}, - "resource": {
- "resource_id": 194,
- "resource_path": "/api/v1/workspace/stats"
}, - "role": {
- "role_id": 3,
- "role": "Platform Admin",
- "description": "Can access and add or remove tenants across all tenants"
}, - "from_date": "2024-06-14T08:40:52.728Z",
- "to_date": null
}
]
]Get Rule details for a specific Rule Id, Permission type Id, and Resource Id
Rule details are filtered for a specific Rule Id, Permission type Id, and Resource Id, and returned in the form of JSON
Authorizations:
path Parameters
| id required | integer <int64> Rule Id that will be used for filter |
| auth_resources_id required | integer <int64> Resource Id that will be used for filter |
| auth_permissions_type_id required | integer <int64> Permission type Id that will be used for filter |
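A three-parameter lookup sketch under the same assumptions (illustrative base URL, bearer token, and route layout).

```python
import os
import requests

BASE_URL = os.environ.get("YEEDU_BASE_URL", "https://yeedu.example.com")  # assumed
HEADERS = {"Authorization": f"Bearer {os.environ['YEEDU_TOKEN']}"}        # assumed auth scheme

rule_id = 754
auth_resources_id = 194
auth_permissions_type_id = 0

# Hypothetical route for this sketch; the real path layout may differ.
resp = requests.get(
    f"{BASE_URL}/api/v1/rule/{rule_id}/resource/{auth_resources_id}"
    f"/permission/{auth_permissions_type_id}",
    headers=HEADERS,
)
resp.raise_for_status()
rule = resp.json()[0]  # sample response is a single-element list
print(rule["permission_type"]["permission"],
      rule["resource"]["resource_path"], rule["role"]["role"])
```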
Responses
Response samples
- 200
- 400
- 401
- 404
- 500
[- {
- "rule_id": "754",
- "permission_type": {
- "permission_id": 0,
- "permission": "GET",
- "description": "To list a resource"
}, - "resource": {
- "resource_id": 194,
- "resource_path": "/api/v1/workspace/stats"
}, - "role": {
- "role_id": 3,
- "role": "Platform Admin",
- "description": "Can access and add or remove tenants across all tenants"
}, - "from_date": "2024-06-14T08:40:52.728Z",
- "to_date": null
}
]