Actions for Databricks #
- Send Data to Databricks. Sends a custom set of Lucidum data to Databricks.
Use Cases #
Below are the possible use cases for these actions:
- If you want to run Lucidum “headless”, you can send relevant data to Databricks on a regular schedule.
- You can send normalized, enriched Lucidum data to Databricks to be indexed, searched, and analyzed.
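The scheduled-export use case above amounts to splitting a set of records into bounded JSON payloads before each send. The sketch below is illustrative only: the record shape and the `build_payloads` helper are assumptions, not part of Lucidum; the 50-record bound mirrors the Max # of Records per Payload setting described later in this article.

```python
import json

MAX_RECORDS_PER_PAYLOAD = 50  # mirrors the "Max # of Records per Payload" setting

def build_payloads(records, max_per_payload=MAX_RECORDS_PER_PAYLOAD):
    """Split a list of normalized records into JSON payloads of bounded size."""
    payloads = []
    for start in range(0, len(records), max_per_payload):
        chunk = records[start:start + max_per_payload]
        payloads.append(json.dumps(chunk))
    return payloads
```

A scheduler (cron, for example) would then call this helper on each run and send one payload per action invocation.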
Prerequisites #
To execute Databricks actions, you must configure a Databricks API connection beforehand.
NOTE. The specified account should have read and write permissions.
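For context, Databricks API calls authenticate with a personal access token sent as a bearer token on each HTTP request. The helper below is a minimal sketch of that standard header construction; the function name and the token value are placeholders, not part of Lucidum or the Databricks SDK.

```python
def databricks_auth_headers(access_token: str) -> dict:
    """Build the HTTP headers Databricks expects for personal-access-token auth."""
    return {
        "Authorization": f"Bearer {access_token}",
        "Content-Type": "application/json",
    }
```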
Workflows #
- Creating a new Configuration and a new Action
- Cloning an Existing Action
- Creating a new Action from the Location Results page
- Editing a Configuration
- Editing an Action
- Viewing Information about an Action
Databricks Configuration #
To create a configuration for Databricks actions, define the following fields:
- Configuration Name. Identifier for the Configuration. This name will appear in the Lucidum Action Center.
- Host. The URL of the Databricks account console (https://accounts.cloud.databricks.com) or the URL of the Databricks workspace (https://{workspace-id}.cloud.databricks.com).
- HTTP Path. The HTTP Path for the Databricks cluster. For details, see https://docs.databricks.com/aws/en/integrations/compute-details.
- Access Token. The access token for a Databricks account that has read and write access to the Databricks API. For details, see https://docs.databricks.com/aws/en/dev-tools/auth/pat.
- Max # of Records per Payload. The maximum number of records to send to Databricks in each action. The default value is “50”.
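The configuration fields above map onto a client connection roughly as follows. This is a sketch under assumptions: it targets the `databricks-sql-connector` package (which expects a bare hostname rather than a full URL), and the config keys, hostnames, paths, and token are placeholders rather than Lucidum internals.

```python
def connection_kwargs(config: dict) -> dict:
    """Translate the configuration fields into connector keyword arguments.
    The Host field is stored as a URL, but the connector wants a bare hostname.
    """
    return {
        "server_hostname": config["host"].replace("https://", ""),
        "http_path": config["http_path"],
        "access_token": config["access_token"],
    }

# Hypothetical usage (requires the databricks-sql-connector package):
# from databricks import sql
# with sql.connect(**connection_kwargs(config)) as conn:
#     with conn.cursor() as cur:
#         cur.execute("SELECT 1")
```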
Create a New Action #
To create an action for Databricks, contact Lucidum customer care.