Data Studio's REST API provides a programmatic interface to the application, allowing you to fully automate or speed up certain operations.

API reference

The full interactive API reference is available in Swagger UI, allowing you to test resources and get a better understanding of all requests and parameters.

You can access it at http://<server>/api/docs/index.html. If you've installed Data Studio locally using the default server port, the URL will be http://localhost:7701/api/docs/index.html.

The API covers Datasets, Spaces, users, user groups, roles, Workflows, Schedules, metadata import/export, auditing and system metrics.

You can use the REST API to create and manage Datasets.

Create a Dataset

When creating a Dataset, the following parameters can be supplied:

Parameters Description
datasetName The name of the Dataset to be uploaded.

Type: String
externalLabel (Optional) The external label of the Dataset to be uploaded.

Type: String
space The Space that the Dataset will be uploaded to.

Type: String
Default value: Your space
fileKey The file key to an uploaded file. This is obtained by calling the Upload operation. The uploaded file will be removed after a successful load.

Type: String
filepath The file path of the cloud storage that the Dataset will be uploaded from.
Type: String
autotag (Optional) If 'True', enables the auto-tagging option for the Dataset to be uploaded.

Type: Boolean
Default value: True
Valid options: True, False
type The external system type (as defined in Data Studio) used to upload the Dataset.

Type: String
Default value: Amazon_S3
name The external system name (as defined in Data Studio) used to upload the Dataset.
Type: String
credentialName The external system credential name (as defined in Data Studio) used to upload the Dataset.
Type: String
characterSet Character set for file based parsers.

Type: String
Example value: UTF-8
delimiter Delimiter character for delimited file parsers.

Type: String
Example value: ,
quoteCharacter Enclosing quote character for delimited file parsers.

Type: String
Example value: "
useFirstRowForColumnNames Treat the first row as containing column names for file-based parsers.
Type: Boolean
newlineCharsInQuotesAreNewRows If a newline character is found within a quoted value, treat it as the start of a new row.
Type: Boolean
datatypeHandling Defines how data types are allocated to columns.

Type: String
Valid options: Detect, Alphanumeric, Mixed
startFromRow Import data starting from the specified row.

Type: Integer
Default value: 1
parserType The type of parser to use.

Type: String
Example value: Csv
Valid options: Default, Csv, FixedWidth
columnName The name of the column. Headers in the data file are not used.

Type: String
Example value: CustomerName
startPosition Start position of the column. 1-indexed, i.e. the first character in the file is position 1.

Type: Integer
Example value: 1
endPosition End position of the column. This is the last character included in the column. An endPosition of 12 will include the character at position 12.

Type: Integer
Example value: 12
length Length of the column. This is needed for validation.

Type: Integer
Example value: 12
dataType The data type of the column.

Type: String
Example value: Alphanumeric
summary Column description.

Type: String
Example value: The surname of the customer

Creating a Dataset will return a statusId which you can use to check the creation status of the Dataset.
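
As a minimal sketch (in Python, using the requests library), a create call against the POST /{tenancyId}/datasets/create endpoint might look like the following. The base path, placeholder values and the exact request body schema are assumptions; confirm them in the Swagger UI reference.

import requests

BASE_URL = "http://localhost:7701/api"  # assumed base path; confirm in the Swagger UI reference
HEADERS = {"Authorization": "Default <your API key>"}  # <Environment external label> <API key>
TENANCY = "1"  # tenancyId or tenancy external label

# fileKey is obtained from a prior call to the Upload operation
payload = {
    "datasetName": "Customers",
    "space": "Marketing",
    "fileKey": "<fileKey from the Upload operation>",
    "autotag": True,
    "characterSet": "UTF-8",
    "delimiter": ",",
    "quoteCharacter": '"',
    "useFirstRowForColumnNames": True,
    "parserType": "Csv",
}

response = requests.post(f"{BASE_URL}/{TENANCY}/datasets/create", json=payload, headers=HEADERS)
response.raise_for_status()
print(response.json())  # includes the statusId used to poll the creation status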

Retrieve a Dataset

There are three ways to retrieve a Dataset:

  1. Retrieve a list of Datasets based on the tenancyId or tenancy external label only.
  2. Retrieve a specific Dataset based on the combination of: tenancyId or tenancy external label, spaceId or space external label and datasetId or dataset external label.
  3. Retrieve a list of Datasets in a Space based on the combination of: tenancyId or tenancy external label and spaceId or space external label.

The returned information is:

  • Dataset ID
  • Dataset UUID
  • Dataset type
  • Dataset name
  • Dataset summary
  • Dataset description
  • Dataset external label
  • Dataset enableDropZone settings
  • Dataset publish OData settings
  • Dataset batch settings
  • Dataset batch limit
  • Dataset allow batch deletion settings
  • Dataset allow auto refresh settings
  • Space where Dataset is uploaded
  • Space ID where Dataset is uploaded
  • Source type used to upload the Dataset:
    • External system (only applicable when the source type is an external system)
    • Credentials (only applicable when the source type is an external system)
  • Dataset tables:
    • Table UUID
    • Table name
    • Table summary
    • Table description
    • Table Columns
      • Column UUID
      • Column data type
      • Column name
      • Column description
      • Column tags

Delete a Dataset

To delete a Dataset from Data Studio, use the combination of spaceId or space external label and datasetId or dataset external label. On success, the operation returns a response with no content.

You can use the REST API to create and manage Spaces.

Create a Space

When creating a Space, the following parameters can be supplied:

Parameters Description
name The name of the Space.

Type: String
description The description of the Space.

Type: String
externalLabel The external label of the Space.

Type: String
allowDataExposure Allow publishing of Datasets to ODBC and OData for this Space.

Type: Boolean
permissionsType Space permission type.

Type: String
Valid options: ADMIN, READER, WRITER
type Space permission user/user group type.

Type: String
Valid options: USER, USER_GROUP
id The ID of the user/user group.

Type: Integer
uuid The UUID of the user/user group.

Type: String
name The name of the user/user group.

Type: String
username The username of the user.

Type: String
schema The Space schema.

Type: String
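
A minimal Python sketch of creating a Space via POST /{tenancyId}/spaces follows; the nesting of the permission entries is an assumption, so check the Swagger UI reference for the exact schema.

import requests

BASE_URL = "http://localhost:7701/api"  # assumed base path; confirm in the Swagger UI reference
HEADERS = {"Authorization": "Default <your API key>"}
TENANCY = "1"

payload = {
    "name": "Marketing",
    "description": "Space for the marketing team",
    "externalLabel": "marketing",
    "allowDataExposure": False,
    # permission entries are illustrative; the exact nesting is in the Swagger reference
    "permissions": [{"permissionsType": "ADMIN", "type": "USER", "id": 42}],
}

response = requests.post(f"{BASE_URL}/{TENANCY}/spaces", json=payload, headers=HEADERS)
response.raise_for_status()
print(response.json())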

Retrieve a Space

There are two ways to retrieve a Space:

  1. Retrieve a list of Spaces based on the tenancyId or tenancy external label only.
  2. Retrieve a specific Space based on the combination of: tenancyId or tenancy external label and spaceId.

The returned information is:

  • Space name
  • Space description
  • Space external label
  • Space allow data exposure status
  • Space permissions details:
    • Space permission type
    • Space assigned user/user group type
    • Space assigned user/user group ID
    • Space assigned user/user group UUID
    • Space assigned user/user group name
    • Space assigned username
  • Space schema
  • Space ID
  • Space UUID

Update/Patch a Space

To update or patch a Space, use the combination of tenancyId or tenancy external label and the spaceId defined in the Space properties. The following parameters must also be supplied:

Parameters Description
name The name of the Space.

Type: String
description The description of the Space.

Type: String
externalLabel The external label of the Space.

Type: String
allowDataExposure Allow publishing of Datasets to ODBC and OData for this Space.

Type: Boolean
permissionsType Space permission type.

Type: String
Valid options: ADMIN, READER, WRITER
type Space permissions user/user group type.

Type: String
Valid options: USER, USER_GROUP
id The ID of the user/user group.

Type: Integer
uuid The UUID of the user/user group.

Type: String
name The name of the user/user group.

Type: String
username The username of the user.

Type: String
schema The schema of the Space.

Type: String

Once updated/patched, the Space information will be changed accordingly.

Delete a Space

To delete a Space, use the combination of tenancyId or tenancy external label and spaceId. On success, the operation returns a response with no content.

You can use the REST API to create and manage users.

Create a user

When creating a user, the following parameters can be supplied:

Parameters Description
name The friendly display name of the user.

Type: String
username The unique user ID to use when logging in. (Only used for LDAP and SAML authentication).

Type: String
email The email address of the user. For standard authentication, this is used to log in with.

Type: String
id The ID of the user role.

Type: Integer
uuid The UUID of the user role.

Type: String
name The name of the user role.

Type: String
isDisabled If 'true', the user account is disabled.

Type: Boolean
isInstallationManager If 'true', the user is an installation manager.

Type: Boolean
authenticationType The authentication type of the user. For SAML, use Internal (the default), as SAML is an exclusive authentication method in Data Studio. Note that SAML has to be enabled in Data Studio.

Type: String
Default value: Internal
Valid options: Internal, Ldap
password The user password for the first login. When SAML is enabled, this field is ignored.

Type: String
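
For illustration, here's a hedged Python sketch of creating a standard-authentication user via POST /{tenancyId}/users; the way the role is referenced in the body is an assumption, so check the Swagger UI reference for the exact schema.

import requests

BASE_URL = "http://localhost:7701/api"  # assumed base path; confirm in the Swagger UI reference
HEADERS = {"Authorization": "Default <your API key>"}
TENANCY = "1"

payload = {
    "name": "Jane Smith",
    "email": "jane.smith@example.com",
    "authenticationType": "Internal",
    "password": "<first-login password>",
    "isDisabled": False,
    "isInstallationManager": False,
    # the role reference is illustrative; the exact nesting is in the Swagger reference
    "role": {"id": 3},
}

response = requests.post(f"{BASE_URL}/{TENANCY}/users", json=payload, headers=HEADERS)
response.raise_for_status()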

Retrieve a user

There are two ways to retrieve a user:

  1. Retrieve a list of users based on the tenancyId or tenancy external label only.
  2. Retrieve a specific user based on the combination of: tenancyId or tenancy external label and userId.

The returned information is:

  • User name
  • Username for login
  • User email
  • Role details:
    • Role ID
    • Role UUID
    • Role name
  • User status
  • User installation manager status
  • User authentication type
  • User ID
  • User UUID
  • User password expired status
  • User last login date
  • User locked until date
  • User super admin status

Update/Patch a user

To update or patch a user, use the combination of tenancyId or tenancy external label and the userId defined in the user properties. The following parameters must also be supplied:

Parameters Description
name The friendly display name of the user.

Type: String
username The unique user ID to use when logging in. (Only used for LDAP and SAML authentication).

Type: String
email The email address of the user. For standard authentication, this is used to log in with.

Type: String
id The ID of the user role.

Type: Integer
uuid The UUID of the user role.

Type: String
name The name of the user role.

Type: String
isDisabled If 'true', the user account is disabled.

Type: Boolean
isInstallationManager If 'true', the user is an installation manager.

Type: Boolean
authenticationType The authentication type of the user. For SAML, use Internal (the default), as SAML is an exclusive authentication method in Data Studio. Note that SAML has to be enabled in Data Studio.

Type: String
Default value: Internal
Valid options: Internal, Ldap
password The user password for the first login. When SAML is enabled, this field is ignored.

Type: String

Once updated/patched, the user information will be changed accordingly.
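
As a sketch, a PATCH against /{tenancyId}/users/{userId} might look like this; whether the endpoint accepts a partial body (PATCH) or requires the full user definition (PUT) should be confirmed in the Swagger UI reference.

import requests

BASE_URL = "http://localhost:7701/api"  # assumed base path; confirm in the Swagger UI reference
HEADERS = {"Authorization": "Default <your API key>"}
TENANCY = "1"
USER_ID = "27"  # the user's ID

# disable the user account; other fields are left unchanged
response = requests.patch(
    f"{BASE_URL}/{TENANCY}/users/{USER_ID}",
    json={"isDisabled": True},
    headers=HEADERS,
)
response.raise_for_status()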

Delete a user

To delete a user, use the combination of tenancyId or tenancy external label and userId. On success, the operation returns a response with no content.

You can use the REST API to create and manage user groups.

Create a user group

When creating a user group, the following parameters can be supplied:

Parameters Description
name The user group name.

Type: String
summary The summary of the user group.

Type: String
description The description of the user group.

Type: String
memberType The permission of the user in the user group.

Type: String
Valid options: Manager, Member
id The ID of the user.

Type: Integer
uuid The UUID of the user.

Type: String
username The username of the user.

Type: String

Retrieve a user group

There are two ways to retrieve a user group:

  1. Retrieve a list of user groups based on the tenancyId or tenancy external label only.
  2. Retrieve a specific user group based on the combination of: tenancyId or tenancy external label and userGroupId.

The returned information is:

  • User group name
  • User group summary
  • User group description
  • The user group member details:
    • User member type
    • User ID
    • User UUID
    • Username
  • User group ID
  • User group UUID

Update/Patch a user group

To update or patch a user group, use the combination of tenancyId or tenancy external label and the userGroupId defined in the user group properties. The following parameters must also be supplied:

Parameters Description
name The user group name.

Type: String
summary The summary of the user group.

Type: String
description The description of the user group.

Type: String
memberType The permission of the user in the user group.

Type: String
Valid options: Manager, Member
id The ID of the user.

Type: Integer
uuid The UUID of the user.

Type: String
username The username of the user.

Type: String

Once updated/patched, the user group information will be changed accordingly.

Delete a User Group

To delete a user group, use the combination of tenancyId or tenancy external label and userGroupId. On success, the operation returns a response with no content.

You can use the REST API to create and manage user roles.

Create a role

When creating a role, the following parameters can be supplied:

Parameters Description
name The role name.

Type: String
summary The summary of the role.

Type: String
description The description of the role.

Type: String
lockedDown If set to 'true', only users that have been selected as managers of the role will be able to assign it to other users.

Type: Boolean
id The ID of the role manager.

Type: Integer
uuid The UUID of the role manager.

Type: String
userId The user ID of the role manager.

Type: String
capability A list of capabilities assigned to the role.

Type: String

Retrieve a role

There are two ways to retrieve a role:

  1. Retrieve a list of roles based on the tenancyId or tenancy external label only.
  2. Retrieve a specific role based on the combination of: tenancyId or tenancy external label and roleId.

The returned information is:

  • Role name
  • Role summary
  • Role description
  • Role locked down settings
  • Role manager details:
    • Role manager ID
    • Role manager UUID
    • Role manager user ID
    • Role manager name
  • Role capabilities
  • Role ID
  • Role UUID

Update/Patch a role

To update or patch a role, use the combination of tenancyId or tenancy external label and the roleId defined in the role properties. The following parameters must also be supplied:

Parameters Description
name The role name.

Type: String
summary The summary of the role.

Type: String
description The description of the role.

Type: String
lockedDown If set to 'true', only users that have been selected as managers of the role will be able to assign it to other users.

Type: Boolean
id The ID of the role manager.

Type: Integer
uuid The UUID of the role manager.

Type: String
userId The user ID of the role manager.

Type: String
capability A list of capabilities assigned to the role.

Type: String

Once updated/patched, the role information will be changed accordingly.

Delete a role

To delete a role, use the combination of tenancyId or tenancy external label and roleId. On success, the operation returns a response with no content.

You can use the API to execute a specific Workflow or to get the status of currently running or recently run Workflows.

Execute a Workflow

To execute a specific Workflow, use the external label defined in the Workflow properties. The following parameters must also be supplied:

Parameters Description
externalLabel The external label specified in Workflow properties.

Type: String
versionToExecute Select whether to execute the latest draft or the last published version of the Workflow.

Type: String
Valid options: PreferDraft, PublishedOnly
dependenciesToUse When Functions and Views are used in the Workflow, select whether to use the latest draft or the last published version of those Functions and Views.

Type: String
Valid options: PreferDraft, PublishedOnly
refreshSources Determines whether source data should be refreshed before executing the Workflow.

Type: Boolean
workflowParameters Defines the Workflow parameter values that will be used in the execution.

Type: String
sources Defines the data source that will be used in the execution. Two properties per source are accepted: the Source step's external label and the Dataset's external label.

The Source step's external label is used to ensure that the correct Source step is configured. The Can supply source when executed option has to be enabled for this Workflow step to make the data source replaceable at Workflow execution.

The Dataset's external label is used to refer to the Dataset which will be used by the Source step when the Workflow is executed. This is only mandatory if the Must be supplied option is enabled in the step. Otherwise, the Workflow will be executed with the default data source.

Type: String

Executing a Workflow will return an executionId which you can use to check the status of the Job.
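
A minimal Python sketch of the execution request against POST /{tenancyId}/jobs follows; the property names inside each sources entry are illustrative placeholders (only the Source step and Dataset external labels are documented above), so confirm the exact schema in the Swagger UI reference.

import requests

BASE_URL = "http://localhost:7701/api"  # assumed base path; confirm in the Swagger UI reference
HEADERS = {"Authorization": "Default <your API key>"}
TENANCY = "1"

payload = {
    "externalLabel": "daily-cleanse",
    "versionToExecute": "PublishedOnly",
    "dependenciesToUse": "PublishedOnly",
    "refreshSources": True,
    "workflowParameters": {"region": "UK"},  # illustrative key-value pairs
    # the property names below are illustrative placeholders
    "sources": [{"sourceExternalLabel": "customer-source", "datasetExternalLabel": "customers-2024"}],
}

response = requests.post(f"{BASE_URL}/{TENANCY}/jobs", json=payload, headers=HEADERS)
response.raise_for_status()
print(response.json())  # includes the executionId used to check the Job status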

Retrieve a Job

To retrieve a list of Jobs from Data Studio, you need to specify the tenancyId or tenancy external label.

The returned information is:

  • Job execution ID
  • Space where job is executed
  • Job name
  • Job start time
  • Job end time
  • Job execution duration
  • Job error message
  • Job latest message
  • Job status
  • Job step details
    • Job step ID
    • Job step name
    • Job step start time
    • Job step end time
    • Job step duration
    • Job step status
    • Job step progress

Check the status of Jobs

The API allows you to query the status of all recently executed Jobs, or of a specific Job using its executionId.

For each Job, the information returned is the same that's available in the Jobs page in Data Studio:

  • The name of the Workflow
  • Who initiated the Workflow
  • Start and end time
  • Current state
  • Details for each step completed within that Workflow

The current state can be any of the following:

State Description
Executed The REST API has triggered the Job to run. The executionId is created and the Job will eventually be picked up by the Workflow executor.
Started The Workflow executor has picked up and started the Job.
InProgress The Job is currently running.
Completed The Job has completed successfully without errors.
Failed The Job has failed with an error and terminated.
Stopped The Job has been cancelled.
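
A small Python sketch of the status query via GET /{tenancyId}/jobs/{executionId}; the base path and placeholder values are assumptions to adjust for your installation.

import requests

BASE_URL = "http://localhost:7701/api"  # assumed base path; confirm in the Swagger UI reference
HEADERS = {"Authorization": "Default <your API key>"}
TENANCY = "1"
EXECUTION_ID = "123"  # returned when the Job was submitted

response = requests.get(f"{BASE_URL}/{TENANCY}/jobs/{EXECUTION_ID}", headers=HEADERS)
response.raise_for_status()
job = response.json()
print(job)  # includes the current state: Executed, Started, InProgress, Completed, Failed or Stopped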

Retrieve Workflow settings

You can use the API to retrieve the settings of a selected Workflow.

To do that, you have to specify the following fields in your path parameter:

Name Description
Workflow identifier The external label of the Workflow or Workflow ID
Workflow version Workflow's version type

Valid options: PreferDraft, PublishedOnly

PreferDraft will retrieve the latest draft of the Workflow, if one exists; otherwise, the published version will be selected. PublishedOnly will retrieve the published version.

Information returned is:

  • The external label of the Workflow
  • Details of each Source step within the Workflow
  • Details of each Workflow parameter
  • Workflow version (draft or published)

You can use the API to retrieve a list of exportable objects and export related metadata.

Retrieve an exportable object

To retrieve a list of exportable objects (draft/published version) in a Space, you need to specify the combination of tenancyId or tenancy external label and spaceId or space external label.

The returned information is:

  • Exportable object type
  • Object details:
    • Object ID
    • Object name
    • Object summary
    • Object status version
    • Object dependencies:
      • Object type
      • Object ID
      • Object name
    • Object force dependencies

Export metadata

To export metadata, use the combination of tenancyId or tenancy external label, spaceId or space external label, and the allowDraftVersions (object version) parameter defined in the Export properties. The following parameters must also be supplied:

Parameters Description
description A description for the export that will be included in the exported file.

Type: String
objectType The object type of all of the objects in the list.

Type: String
includeData Indicates that associated data should also be exported for all objects in the list (only applicable for Datasets).

Type: Boolean
id The internal numeric ID for the object (the ID used in the object's URL). The ID of objects can be obtained by running the 'Exportable Objects' operation.

Type: Integer

Exporting metadata will return a download link which you can use to download the .dmx/.dmxd file.
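
A hedged Python sketch of an export request against POST /{tenancyId}/export/{spaceId}; the grouping of objectType, includeData and id in the body is an assumption, so confirm the exact schema in the Swagger UI reference.

import requests

BASE_URL = "http://localhost:7701/api"  # assumed base path; confirm in the Swagger UI reference
HEADERS = {"Authorization": "Default <your API key>"}
TENANCY = "1"
SPACE_ID = "5"  # spaceId or space external label

payload = {
    "description": "Nightly export of Workflow metadata",
    # the object list nesting is illustrative; IDs come from the Exportable Objects operation
    "objects": [{"objectType": "Workflow", "includeData": False, "id": 12}],
}

response = requests.post(f"{BASE_URL}/{TENANCY}/export/{SPACE_ID}", json=payload, headers=HEADERS)
response.raise_for_status()
print(response.json())  # includes the download link for the .dmx/.dmxd file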

There are two audit endpoints: one reports information on user sessions, the other reports all audit events.

Report user sessions

The /sessions endpoint returns a list of all user sessions, within a specified number of seconds or up to a record limit.

The returned list contains:

  • The user ID and Name
  • Start and end date/time of the sessions
  • The environment which was accessed

Report all audit events

The /events endpoint returns a more detailed list of all events recorded by the system. Each event has a date and type and is associated with a user and specific session ID.

You can monitor the health of the system and gather diagnostic information. The metrics endpoints return both Java and application metrics for the entire system across all Environments:

  • The Java metrics track the application JVM resource consumption in terms of process CPU usage, memory usage, active thread count and the garbage collection process.
  • The application metrics (e.g. number of active user sessions, response time to load specific UI pages) can be used to measure and monitor the performance of Data Studio over time.

You can use the API Upload operation to upload a Data Studio metadata file (.dmx or .dmxd) and then import or synchronize the metadata into a Space.

Import metadata

To import metadata, use the combination of tenancyId or tenancy external label and spaceId or space external label, as defined in the Import properties. The following parameters must also be supplied:

Parameters Description
fileKey The key of an uploaded file. This is obtained by calling the Upload operation. The uploaded file will be removed after a successful import.

Type: String
testOnly If set to 'true', running the import process will only produce a report, but the actual import operation will not be executed. The uploaded file is not removed.

Type: Boolean
rollBackTarget Only applicable for Synchronize operations.

Type: Boolean
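
For illustration, a minimal Python sketch of an import request against POST /{tenancyId}/import/{spaceId}, run here as a test-only dry run; confirm the exact body schema in the Swagger UI reference.

import requests

BASE_URL = "http://localhost:7701/api"  # assumed base path; confirm in the Swagger UI reference
HEADERS = {"Authorization": "Default <your API key>"}
TENANCY = "1"
SPACE_ID = "5"  # spaceId or space external label

payload = {
    "fileKey": "<fileKey from the Upload operation>",
    "testOnly": True,  # produce the import report without actually importing
}

response = requests.post(f"{BASE_URL}/{TENANCY}/import/{SPACE_ID}", json=payload, headers=HEADERS)
response.raise_for_status()
print(response.json())  # the import report described below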

The returned information is:

  • Import test mode
  • Number of objects imported
  • Number of errors during import
  • Number of warnings during import
  • General error during import
  • Import object with errors
  • Import object with warning
  • Import object responses:
    • Object type
    • Name
    • Source UUID
    • Target UUID
    • Import action
    • Import action info
    • Errors
    • Object warnings
    • Data warnings
    • Target has draft
    • Target external label
    • Has data

Synchronize metadata

To synchronize metadata, use the combination of tenancyId or tenancy external label and spaceId or space external label, as defined in the Synchronize properties. The following parameters must also be supplied:

Parameters Description
fileKey The key of an uploaded file. This is obtained by calling the Upload operation. The uploaded file will be removed after a successful import.

Type: String
testOnly If set to 'true', running the synchronize process will only produce a report, but the actual synchronize operation will not be executed. The uploaded file is not removed.

Type: Boolean
rollBackTarget If set to 'true', any changes made to a target object will be overwritten (rolled back) if the target object is a later version (or has a draft version); otherwise the synchronize operation will fail with an error.

Type: Boolean

The returned information is:

  • Import test mode
  • Number of objects imported
  • Number of errors during import
  • Number of warnings during import
  • General error during import
  • Import object with errors
  • Import object with warning
  • Import object responses:
    • Object type
    • Name
    • Source UUID
    • Target UUID
    • Import action
    • Import action info
    • Errors
    • Object warnings
    • Data warnings
    • Target has draft
    • Target external label
    • Has data

Query the creation status of a Dataset

Some Datasets are large and can take a significant amount of time to be created in Data Studio when importing from an external source. After using the REST API to create a Dataset, you can query the creation status to determine whether the operation completed successfully.

Aperture Data Studio provides the ability to check the Dataset creation progress and status via an HTTP GET endpoint, using the statusId as the path parameter.

When querying the creation status of a Dataset, the following response information can be expected:

Response information Description
status The current overall creation status of the Dataset when the query was executed.
Possible statuses are: Queued, Initialize, CreateDataset, AutoTagging, Loading, Created, and Error.
The creation status transitions in the following order:
Queued > Initialize > CreateDataset > (If configured) AutoTagging > Loading > Created.
Should an error occur at any point of the creation process, the status returned is Error.
statusId The ID of this status response.
tagInfo The column tagging information for the Dataset. If auto-tag is enabled, a tagInfo object is returned; otherwise null is returned.
endTime The time of completion of the Dataset creation.
startTime The time when the Dataset creation began.
datasetInfo Contains details about the Dataset, such as rowCount, datasetUuid, and externalLabel. If overall status successfully reaches Created, a datasetInfo object is returned with the relevant information. Returns null if Dataset creation was unsuccessful.
errorMessage Contains details about Dataset creation failure, such as reason and details if Dataset creation was unsuccessful. Returns null if Dataset creation was successful.
phasesProgress Contains details about the Dataset creation tasks, such as progressPercentage, status, and durationMs. durationMs refers to the elapsed time for the phase in milliseconds. A phase is a stage in the Dataset creation process as described in status above.
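
A small Python polling sketch against GET /{tenancyId}/datasets/create/{statusId}; the status field name and values follow the table above, while the base path and placeholder values remain assumptions to confirm in the Swagger UI reference.

import time
import requests

BASE_URL = "http://localhost:7701/api"  # assumed base path; confirm in the Swagger UI reference
HEADERS = {"Authorization": "Default <your API key>"}
TENANCY = "1"
STATUS_ID = "<statusId returned by the create call>"

# poll until the Dataset reaches a terminal status
while True:
    status = requests.get(f"{BASE_URL}/{TENANCY}/datasets/create/{STATUS_ID}", headers=HEADERS).json()
    print(status["status"])
    if status["status"] in ("Created", "Error"):
        break
    time.sleep(5)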

You can use the REST API to retrieve the list of API keys based on the tenancyId or tenancy external label.

The returned information is:

  • API Key permission list
  • API Key name
  • API Key creation date
  • API Key expiration date
  • API Key status
  • Environment name where API Key is created

You can execute a Schedule and monitor its status through our API. This is particularly useful if you plan to use Schedules to group Workflows together.

Execute a Schedule

To execute a Schedule, specify its external label, as defined in the Schedule wizard.

Parameters Description
externalLabel The Schedule's external label.

Type: String
workflows Collection of Workflows, to override Workflow parameters and Source replacements.

Type: Array of JSON Object (refer to the table below)
Workflow parameters Description
externalLabel The Workflow's external label.

Type: String
workflowParameters Key-value pair of parameter to override.

Type: JSON Object
sources Collection of Source replacements in key-value pairs.

Type: Array of JSON Object

Executing a Schedule through the API will return an executionId which you can use to check the execution status of the Schedule.
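
A minimal Python sketch of a Schedule execution via POST /{tenancyId}/schedule/runnow; the shape of the per-Workflow override entries is an assumption to verify in the Swagger UI reference.

import requests

BASE_URL = "http://localhost:7701/api"  # assumed base path; confirm in the Swagger UI reference
HEADERS = {"Authorization": "Default <your API key>"}
TENANCY = "1"

payload = {
    "externalLabel": "nightly-schedule",
    "workflows": [
        {
            "externalLabel": "daily-cleanse",
            "workflowParameters": {"region": "UK"},  # illustrative parameter override
        }
    ],
}

response = requests.post(f"{BASE_URL}/{TENANCY}/schedule/runnow", json=payload, headers=HEADERS)
response.raise_for_status()
print(response.json())  # includes the executionId used to check the Schedule status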

Check the status of Schedule execution

The API allows you to query the status of all recently executed Schedules using the executionId.

The returned information is:

  • Space where the Schedule is executed
  • Schedule name
  • Who initiated the Schedule
  • Start time
  • End time
  • Overall Schedule execution status
  • Status for each underlying Workflow:
    • Workflow name
    • Start time
    • End time
    • Execution duration
    • Workflow execution status

The REST API can be used to modify the sharing options of Functions.

Retrieve sharing details for a Function

Return sharing details for a Function, including the sharing mode (global/specific spaces/none) and the Spaces it is shared with.

Parameters Description
spaceId The ID or external label of the Space the Function belongs to.

Type: String
functionId The ID of the Function to be returned.

Type: String

Update Function sharing settings

Update the sharing settings of a Function, including changing the sharing mode or adding/removing Spaces that the Function can be shared with.

Parameters Description
spaceId The ID or external label of the Space the Function belongs to.

Type: String
functionId The ID of the Function to be updated.

Type: String
shareMode The desired sharing mode for the Function. This can be SharedGlobally (shared with all Spaces), SharedSelectively (shared with specific Spaces) or NotShared (shared with no Spaces).

Type: String
spaces (Optional) A list of Space IDs (as integers) that the Function can be shared with.
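
A short Python sketch of updating a Function's sharing via PUT /{tenancyId}/functions/{spaceId}/{functionId}/sharing; a direct parameter-to-JSON mapping is assumed, so confirm the schema in the Swagger UI reference.

import requests

BASE_URL = "http://localhost:7701/api"  # assumed base path; confirm in the Swagger UI reference
HEADERS = {"Authorization": "Default <your API key>"}
TENANCY = "1"
SPACE_ID = "5"      # Space the Function belongs to
FUNCTION_ID = "17"  # Function to update

payload = {
    "shareMode": "SharedSelectively",
    "spaces": [3, 7],  # Space IDs the Function is shared with
}

response = requests.put(
    f"{BASE_URL}/{TENANCY}/functions/{SPACE_ID}/{FUNCTION_ID}/sharing",
    json=payload,
    headers=HEADERS,
)
response.raise_for_status()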

The REST API can be used to modify the sharing options of charts, as well as include charts into other spaces.

Retrieve sharing details for a chart

Return sharing details for a chart, including the sharing mode (global/specific spaces/none) and the spaces it is shared with.

Parameters Description
spaceId The ID or external label of the space the chart belongs to.

Type: String
chartId The ID of the chart to be returned.

Type: String

Update chart sharing settings

Update the sharing settings of a chart, including changing the sharing mode or adding/removing spaces that the chart can be shared with.

Parameters Description
spaceId The ID or external label of the space the chart belongs to.

Type: String
chartId The ID of the chart to be updated.

Type: String
shareMode The desired sharing mode for the chart. This can be SharedGlobally (shared with all spaces), SharedSelectively (shared with specific spaces) or NotShared (shared with no spaces).

Type: String
spaces (Optional) A list of space IDs (as integers) that the chart can be shared with.

Include/uninclude a chart

Include or remove a chart from a space it has already been shared with.

Parameters Description
spaceId The ID of the space the chart should be included into or removed from.

Type: String
id The ID of the chart to be included/unincluded.

Type: String
include If True, the chart will be included into the space. If False, the chart will be removed from the space.

Type: Boolean
Default value: True
Valid options: True, False
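
A short Python sketch of including a shared chart into a space via PUT /{tenancyId}/charts/{spaceId}/include; a direct parameter-to-JSON mapping is assumed, so confirm the schema in the Swagger UI reference.

import requests

BASE_URL = "http://localhost:7701/api"  # assumed base path; confirm in the Swagger UI reference
HEADERS = {"Authorization": "Default <your API key>"}
TENANCY = "1"
SPACE_ID = "5"  # space to include the chart into

payload = {"id": "15", "include": True}  # set include to False to remove the chart from the space

response = requests.put(f"{BASE_URL}/{TENANCY}/charts/{SPACE_ID}/include", json=payload, headers=HEADERS)
response.raise_for_status()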

The REST API can be used to modify the sharing options of views, as well as include views into other spaces.

Retrieve sharing details for a view

Return sharing details for a view, including the sharing mode (global/specific spaces/none) and the spaces it is shared with.

Parameters Description
spaceId The ID or external label of the space the view belongs to.

Type: String
viewId The ID of the view to be returned.

Type: String

Update view sharing settings

Update the sharing settings of a view, including changing the sharing mode or adding/removing spaces that the view can be shared with.

Parameters Description
spaceId The ID or external label of the space the view belongs to.

Type: String
viewId The ID of the view to be updated.

Type: String
shareMode The desired sharing mode for the view. This can be SharedGlobally (shared with all spaces), SharedSelectively (shared with specific spaces) or NotShared (shared with no spaces).

Type: String
spaces (Optional) A list of space IDs (as integers) that the view can be shared with.

Include/uninclude a view

Include or remove a view from a space it has already been shared with.

Parameters Description
spaceId The ID of the space the view should be included into or removed from.

Type: String
id The ID of the view to be included/unincluded.

Type: String
include If True, the view will be included into the space. If False, the view will be removed from the space.

Type: Boolean
Default value: True
Valid options: True, False

The REST API can be used to modify the sharing options of Workflows, as well as include Workflows into other Spaces.

Retrieve sharing details for a Workflow

Return sharing details for a Workflow, including the sharing mode (global/specific spaces/none) and the Spaces it is shared with.

Parameters Description
spaceId The ID or external label of the Space the Workflow belongs to.

Type: String
workflowId The ID of the Workflow to be returned.

Type: String

Update Workflow sharing settings

Update the sharing settings of a Workflow, including changing the sharing mode or adding/removing Spaces that the Workflow can be shared with.

Parameters Description
spaceId The ID or external label of the Space the Workflow belongs to.

Type: String
workflowId The ID of the Workflow to be updated.

Type: String
shareMode The required sharing mode for the Workflow. This can be SharedGlobally (shared with all Spaces), SharedSelectively (shared with specific Spaces) or NotShared (shared with no Spaces).

Type: String
spaces (Optional) A list of Space IDs (as integers) that the Workflow can be shared with.

Include/Exclude a Workflow

Include or remove a Workflow from a Space it has already been shared with.

Parameters Description
spaceId The ID of the Space the Workflow should be included into or removed from.

Type: String
id The ID of the Workflow to be included/unincluded.

Type: String
include If True, the Workflow will be included into the Space. If False, the Workflow will be removed from the Space.

Type: Boolean
Default value: True
Valid options: True, False

You can use the API to upload a Data Studio metadata file (.dmx or .dmxd) or a file in any other supported format:

  • .csv
  • .txt
  • .xls
  • .xlsx
  • .sas7bdat
  • .psv
  • .json

To upload metadata, first upload the .dmx/.dmxd file, then run the import or synchronize operation to bring the metadata back into a Space.

Uploading a file will return a fileKey which you will need for the import/synchronize or create Dataset operations.
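
A hedged Python sketch of the Upload operation against POST /{tenancyId}/upload; the multipart form field name is an assumption, so confirm it in the Swagger UI reference.

import requests

BASE_URL = "http://localhost:7701/api"  # assumed base path; confirm in the Swagger UI reference
HEADERS = {"Authorization": "Default <your API key>"}
TENANCY = "1"

with open("customers.csv", "rb") as f:
    response = requests.post(
        f"{BASE_URL}/{TENANCY}/upload",
        headers=HEADERS,
        files={"file": ("customers.csv", f)},  # the form field name "file" is illustrative
    )
response.raise_for_status()
print(response.json())  # includes the fileKey for subsequent import/synchronize or create Dataset calls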

Authentication

Requests to the API are authenticated using an API key which is passed in the Authorization header of each request. To get an API key, you have to log into Data Studio with your username and password first.

Your authorization header should look like this:

{
    "Authorization": "<Environment external label> <Your API Key>"
}

The Environment's external label is used to identify the Environment in which the API key was generated. For users who have been assigned roles in multiple Environments, this information can be found in Data Studio's Manage Environments or Switch Environment pages. The default Environment's external label will be Default but the super admin can change this.

The authorization value is the external label followed by the API key, separated by a space.
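
In practice, a minimal Python call using the requests library sets the header as shown below; the base path and tenancy value are placeholders to adjust for your installation.

import requests

BASE_URL = "http://localhost:7701/api"  # assumed base path; see the Swagger UI reference
ENV_LABEL = "Default"                   # the Environment's external label
API_KEY = "<your API key>"

headers = {"Authorization": f"{ENV_LABEL} {API_KEY}"}

# any authenticated request uses the same header, e.g. listing Spaces in tenancy "1"
response = requests.get(f"{BASE_URL}/1/spaces", headers=headers)
print(response.status_code)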

API keys

To generate an API key:

  1. Click the user icon in the top right corner and select Manage API keys.
  2. Click Create new API Key.
  3. Enter a name for the key and specify the number of days until it expires.
  4. Select the API permission(s) you want to associate with the key.
  5. Click Generate.
  6. Click Copy to take a copy of the generated key. You will have to paste it in the authorization header of your request.

There are multiple operations available through the Aperture Data Studio API. As an additional layer of security, each operation is allowed based on the user capabilities of the user who created the API key and the API permissions associated with that key.

Here are the user capabilities and API permissions required for each operation:

Request method & endpoint Required user capabilities Required API permissions
GET /{tenancyId}/sessions Installation Manager Audit operations (Read)
GET /{tenancyId}/events Installation Manager Audit operations (Read)
Request method & endpoint Required user capabilities Required API permission
GET /{tenancyId}/charts/{spaceId}/{chartId}/sharing Create and Edit Charts Chart operations (Share Charts)
PUT /{tenancyId}/charts/{spaceId}/{chartId}/sharing Create and Edit Charts Chart operations (Share Charts)
PUT /{tenancyId}/charts/{spaceId}/include Create and Edit Charts Chart operations (Share Charts)
Request method & endpoint Required user capabilities Required API permissions
GET /{tenancyId}/datasets View Datasets Dataset operations (Read Dataset metadata)
GET /{tenancyId}/datasets/{spaceId} View Datasets Dataset operations (Read Dataset metadata)
GET /{tenancyId}/datasets/{spaceId}/{datasetId} View Datasets Dataset operations (Read Dataset metadata)
DELETE /{tenancyId}/datasets/{spaceId}/{datasetId} Create and Edit Datasets Dataset operations (Delete)
GET /{tenancyId}/datasets/create/{statusId} Create and Edit Datasets Dataset operations (Read Dataset upload status)
POST /{tenancyId}/datasets/create Create and Edit Datasets Dataset operations (Create)
Request method & endpoint Required user capabilities Required API permissions
POST /{tenancyId}/export/{spaceId} Export Metadata Metadata exchange operations (Export)
Request method & endpoint Required user capabilities Required API permissions
GET /{tenancyId}/exportableobjects/{spaceId} Export Metadata Metadata exchange operations (Export)
Request method & endpoint Required user capabilities Required API permissions
GET /{tenancyId}/functions/{spaceId}/{functionId}/sharing Create and Edit Function Function operations (Share)
PUT /{tenancyId}/functions/{spaceId}/{functionId}/sharing Create and Edit Function Function operations (Share)
Request method & endpoint Required user capabilities Required API permissions
POST /{tenancyId}/import/{spaceId} Import Metadata Metadata exchange operations (Import)
Request method & endpoint Required user capabilities Required API permissions
GET /metrics View System Information Metrics operations (Read)
GET /metrics/list View System Information Metrics operations (Read)
GET /metrics/{name} View System Information Metrics operations (Read)
Request method & endpoint Required user capabilities Required API permissions
GET /{tenancyId}/apiKey API access Rest API key operations (Read keys)
Request method & endpoint Required user capabilities Required API permissions
GET /{tenancyId}/roles Access Designer Interface Security operations (Read)
GET /{tenancyId}/roles/{roleId} Access Designer Interface Security operations (Read)
POST /{tenancyId}/roles Manage Roles and Permissions Security operations (Update)
PUT /{tenancyId}/roles/{roleId} Manage Roles and Permissions Security operations (Update)
PATCH /{tenancyId}/roles/{roleId} Manage Roles and Permissions Security operations (Update)
DELETE /{tenancyId}/roles/{roleId} Manage Roles and Permissions Security operations (Update)
Request method & endpoint Required user capabilities Required API permissions
POST /{tenancyId}/schedule/runnow View Workflows & Execute Workflows Workflow operations (Submit Jobs)
GET /{tenancyId}/schedule/status Monitor Workflow Workflow operations (Read Jobs)
Request method & endpoint Required user capabilities Required API permissions
GET /{tenancyId}/spaces Access Designer Interface Space operations (Read)
GET /{tenancyId}/spaces/{spaceId} Access Designer Interface Space operations (Read)
POST /{tenancyId}/spaces Create and Edit Spaces Space operations (Update)
PUT /{tenancyId}/spaces/{spaceId} Create and Edit Spaces Space operations (Update)
PATCH /{tenancyId}/spaces/{spaceId} Create and Edit Spaces Space operations (Update)
DELETE /{tenancyId}/spaces/{spaceId} Create and Edit Spaces Space operations (Update)
Request method & endpoint Required user capabilities Required API permissions
POST /{tenancyId}/synchronize/{spaceId} Synchronize Metadata Between Environments Metadata exchange operations (Synchronize)
Request method & endpoint Required user capabilities Required API permissions
POST /{tenancyId}/upload Import Metadata & Create and Edit Datasets Metadata exchange operations (Import) & Dataset operations (Create)
Request method & endpoint Required user capabilities Required API permissions
GET /{tenancyId}/users Access Designer Interface Security operations (Read)
GET /{tenancyId}/users/{userId} Access Designer Interface Security operations (Read)
POST /{tenancyId}/users Manage Users Security operations (Update)
PUT /{tenancyId}/users/{userId} Manage Users Security operations (Update)
PATCH /{tenancyId}/users/{userId} Manage Users Security operations (Update)
DELETE /{tenancyId}/users/{userId} Manage Users Security operations (Update)
Request method & endpoint Required user capabilities Required API permissions
GET /{tenancyId}/usergroups Access Designer Interface Security operations (Read)
GET /{tenancyId}/usergroups/{userGroupId} Access Designer Interface Security operations (Read)
POST /{tenancyId}/usergroups Manage User Groups Security operations (Update)
PUT /{tenancyId}/usergroups/{userGroupId} Manage User Groups Security operations (Update)
PATCH /{tenancyId}/usergroups/{userGroupId} Manage User Groups Security operations (Update)
DELETE /{tenancyId}/usergroups/{userGroupId} Manage User Groups Security operations (Update)
Request method & endpoint Required user capabilities Required API permissions
GET /{tenancyId}/views/{spaceId}/{viewId}/sharing Create and Edit Views View operations (Share Views)
PUT /{tenancyId}/views/{spaceId}/{viewId}/sharing Create and Edit Views View operations (Share Views)
PUT /{tenancyId}/views/{spaceId}/include Create and Edit Views View operations (Share Views)
Request method & endpoint Required user capabilities Required API permissions
GET /{tenancyId}/workflows/{spaceId}/{workflowId}/sharing Create and Edit Workflows Workflow operations (Share)
PUT /{tenancyId}/workflows/{spaceId}/{workflowId}/sharing Create and Edit Workflows Workflow operations (Share)
PUT /{tenancyId}/workflows/{spaceId}/include Create and Edit Workflows Workflow operations (Share)
GET /{tenancyId}/workflows/{spaceId}/{workflowId}/settings/{versionType} View Workflows Workflow operations (Read Settings)
Request method & endpoint Required user capabilities Required API permissions
POST /{tenancyId}/jobs View Workflows & Execute Workflows Workflow operations (Submit Jobs)
GET /{tenancyId}/jobs Monitor Workflow Workflow operations (Read Jobs)
GET /{tenancyId}/jobs/{executionId} Monitor Workflow Workflow operations (Read Jobs)

Troubleshooting

If you get an HTTP response with a 401 error, the possible causes are:

Possible cause Error message in HTTP response body
Missing user capabilities The API key must have the following capabilities: {required capabilities}
User (creator of API key) is disabled Unauthorized
Invalid API key Unauthorized
Expired API key Unauthorized
Missing API key Unauthorized
Missing API key permissions Unauthorized
Incorrect environment label Unauthorized
Missing environment label Unauthorized