Organization
Users can set up organization-level configurations that apply across all resources within an organization. The following is a comprehensive list of configurations for an organization.
Organization Information
Organization Name: Enter the desired name for your organization.
Business Name: Specify the name of your business.
Domain: Provide the domain associated with your organization.
Type: Karini or Legal (Deprecated)
Credentials
All credentials are encrypted and secured in Karini AI's vault.
Setup AWS Credentials
AWS account ID:
Add one or more valid 12-digit AWS account IDs. You can link more than one AWS account.
In your AWS account, create a cross-account IAM role with an external ID. Locate your Karini AI Organization ID, visible in the top right corner of the Edit Organization page. You must use your Karini AI Organization ID as the external ID in your IAM role. Follow the AWS documentation for details on creating an IAM role.
Permission Policy:
Attach an inline policy that grants permissions only to the required services, resources, and actions. Refer to AWS least-privilege permissions guidance for security best practices. Following is a sample policy template for Karini AI to access the respective resources in your AWS account; you can further restrict the policy as per your needs.
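For illustration only, here is a minimal sketch of such an inline policy. The statements, actions, and the bucket name your-data-bucket are placeholders; the actual services and resources depend on which Karini AI features you use in your AWS account.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "KariniDataAccess",
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::your-data-bucket",
        "arn:aws:s3:::your-data-bucket/*"
      ]
    },
    {
      "Sid": "KariniModelInvocation",
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "*"
    }
  ]
}
```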
Trust Policy:
Attach a trust policy that allows the Karini AI principal to assume the role under the predefined condition on your external ID. Following is a sample trust policy.
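A minimal sketch of such a trust policy is shown below. The Karini AI principal account ID is a placeholder; use the value provided by Karini AI, and note that the external ID condition must match your Karini AI Organization ID.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::<KARINI_AI_ACCOUNT_ID>:root"
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": {
          "sts:ExternalId": "<YOUR_KARINI_AI_ORGANIZATION_ID>"
        }
      }
    }
  ]
}
```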
Note: Contact Karini AI for your trust policy configuration.
AWS Credentials:
Add a default global role used to access resources in your AWS account. The role can be overridden on the respective model hub or connector pages if you have a more restrictive role.
AWS IAM Role ARN: The Amazon Resource Name that uniquely identifies the IAM role, for example arn:aws:iam::<account-id>:role/<role-name>. Provide the AWS IAM Role ARN; refer to IAM Roles Overview for more details.
AWS External ID: A unique identifier that third parties use when assuming roles in your account. This is a read-only field and is set to your Karini AI Organization ID; refer to Using External ID for more details.
AWS default region: Select your AWS region from the dropdown.
Model Provider Credentials
OpenAI Credentials:
OpenAI Key: OpenAI API key to access the registered OpenAI models.
Azure OpenAI Credentials:
Azure OpenAI Key: Azure OpenAI API key to access the registered Azure OpenAI models.
Anyscale:
Anyscale API Key: The unique authentication token required to access Anyscale services and resources.
Anyscale API Base: The endpoint where Anyscale services are hosted, facilitating communication between client applications and the Anyscale platform.
Reranker LLM Credentials
Cohere API Key: A unique authentication key provided by Cohere, used to access the Cohere reranker model during the embeddings retrieval process.
Data Connector Provider Credentials
Azure Cloud Credentials
Azure Account Name: The unique identifier associated with your Azure account.
Azure Account Key: A secret authentication key required to access Azure services and resources securely.
Confluence Credentials
Confluence Account Name: Username or account identifier associated with the Confluence account.
Confluence Key: The unique identifier or token assigned to the Confluence account for authentication purposes.
Confluence Product URL: The web address or URL of the Confluence product where the account is hosted, used for accessing Confluence services.
Google Cloud Services Credentials
Credentials JSON: Authentication information stored in JSON format, typically containing details such as the client ID, client secret, and other credentials required to authenticate access to the service.
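For illustration, the sketch below assumes an OAuth-style credentials JSON; the field values are placeholders, and the exact credential type expected for your Google setup (OAuth client credentials or a service account key) may differ.

```json
{
  "type": "authorized_user",
  "client_id": "YOUR_CLIENT_ID.apps.googleusercontent.com",
  "client_secret": "YOUR_CLIENT_SECRET",
  "refresh_token": "YOUR_REFRESH_TOKEN"
}
```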
Box Credentials
Box Credentials: Paste credentials JSON containing "client_id", "client_secret", "access_token", "refresh_token".
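For example (placeholder values only):

```json
{
  "client_id": "YOUR_BOX_CLIENT_ID",
  "client_secret": "YOUR_BOX_CLIENT_SECRET",
  "access_token": "YOUR_ACCESS_TOKEN",
  "refresh_token": "YOUR_REFRESH_TOKEN"
}
```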
Google Drive Credentials
Credentials JSON: Authentication information stored in JSON format, typically containing details such as the client ID, client secret, and other credentials required to authenticate access to the service.
Dropbox Credentials
Dropbox Token: Enter the token, refresh token, client ID, and client secret that allow secure authenticated access to your Dropbox account.
Databricks Runtime Credentials
Databricks Credentials
Credentials for Databricks Workspace.
Databricks Host URL: The URL of your Databricks workspace.
Databricks API Token: The token that grants access to Databricks API endpoints; it is used to authenticate requests to the Databricks API.
Databricks AWS IAM Instance Role: The EC2 instance role that grants permission to AWS services with an sts:AssumeRole policy. This role is used by Databricks to launch the job cluster. For details about creating this role, refer to the Databricks prerequisites.
Databricks HTTP Path: The connection path for your Databricks SQL warehouse.
Databricks Cluster ID [optional]: The unique identifier assigned to your Databricks cluster. It allows you to specify which cluster your job or task should run on within your Databricks workspace.
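For illustration, the sketch below shows typical value formats for these fields. The key names and values are placeholders only, not a Karini AI import format.

```json
{
  "databricks_host_url": "https://dbc-a1b2c3d4-e5f6.cloud.databricks.com",
  "databricks_api_token": "dapiXXXXXXXXXXXXXXXXXXXXXXXXXXXX",
  "databricks_aws_iam_instance_role": "arn:aws:iam::<account-id>:role/databricks-instance-role",
  "databricks_http_path": "/sql/1.0/warehouses/abcdef0123456789",
  "databricks_cluster_id": "0123-456789-abcde123"
}
```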
Global Default Model Endpoints
Intent Detector LLM
LLM that detects intent in chatbots to classify queries as specific or not specific. The prompt for intent detection can be modified on the recipe's output tile.
Intent Detector LLM endpoint: Select an LLM model endpoint from the registered model endpoints list in the UI.
Followup Questions Generator
LLM to generate follow-up questions in the chatbot based on conversation history.
Followup questions generator model endpoint: Select an LLM model endpoint from the registered model endpoints list in the UI.
Natural Language Assistant
LLM to assist in a variety of natural language processing tasks, including text and JSON schema generation, as well as query rewriting and expansion.
Natural Language Assistant model endpoint: Select an LLM endpoint from the registered model endpoints list in the UI.
Global Embeddings Model
The LLM used to generate the Catalog Vector index. Embedding LLMs with a dimension of 1536 are supported.
Global Embeddings model endpoint: Select the Global Embeddings model endpoint from the registered embedding model endpoints list in the UI.
Default Guardrail
Karini AI supports Amazon Bedrock guardrails. The default guardrail is used as the default for supported models and model providers. Users can override it with a more specific guardrail endpoint.
Default Guardrail endpoint: Select the appropriate guardrail from the list.
Speech to Text
Karini AI supports an Audio mode for chatbots, which requires a Speech to Text model. Ensure the model endpoint is added to the Karini AI model hub.
Speech to Text model endpoint: Select one of the available Speech to Text model endpoints.
Text to Speech
Karini AI supports an Audio mode for chatbots, which requires a Text to Speech model. Ensure the model endpoint is added to the Karini AI model hub.
Text to Speech model endpoint: Select one of the available Text to Speech model endpoints.
Custom metadata extraction model
The metadata extraction model used for the custom metadata extraction prompt.
Custom metadata extraction model endpoint: Select a model endpoint from the registered model endpoints list in the UI.