Articulate LLM

AWS Bedrock Quickstart Guide

Get up and running with Articulate LLM Desktop and AWS Bedrock in just a few minutes.

Estimated Setup Time: 25-40 minutes

Step-by-step visualization of the bot configuration process

Contents

  • System Requirements
  • Prerequisites
  • Cost Considerations
  • AWS Account Setup
  • Download and Install
  • Configure Your Bot
  • Advanced Configuration
  • Next Steps
  • Troubleshooting

System Requirements

  • Operating System:
      • Windows 10/11 (64-bit)
      • macOS 10.15 or later
      • Ubuntu 20.04 or later
  • Hardware:
      • Minimum 8 GB RAM
      • 4-core CPU
      • 2 GB free disk space
  • Network: Stable internet connection
  • AWS Account: Active AWS account with Bedrock access

Prerequisites

  • This guide assumes you have a working knowledge of AWS and the AWS Console

Cost Considerations

Important: AWS Bedrock is a paid service. Costs are based on:

  • Number of input/output tokens
  • Model selection
  • Request volume
  • Data transfer

We recommend:

  • Setting up AWS cost alerts (see the sketch after this list)
  • Monitoring usage through AWS Cost Explorer
  • Starting with smaller models for testing
  • Reviewing the AWS Bedrock Pricing Page
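
As a minimal sketch of the cost-alert recommendation above, the command below creates a monthly cost budget with an email alert; the account ID, budget name, amount, threshold, and email address are placeholders, and the same budget can be created from the AWS Billing console instead.

    # Illustrative only: a $25/month cost budget that emails you at 80% of actual spend.
    # Replace the account ID, amount, and email address with your own values.
    aws budgets create-budget \
      --account-id 111122223333 \
      --budget '{"BudgetName": "bedrock-monthly", "BudgetLimit": {"Amount": "25", "Unit": "USD"}, "TimeUnit": "MONTHLY", "BudgetType": "COST"}' \
      --notifications-with-subscribers '[{"Notification": {"NotificationType": "ACTUAL", "ComparisonOperator": "GREATER_THAN", "Threshold": 80}, "Subscribers": [{"SubscriptionType": "EMAIL", "Address": "you@example.com"}]}]'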

AWS Account Setup

AWS Configuration

⏱️ Estimated time: 10 minutes

  1. AWS Account Setup

    • Create or use existing AWS account
  2. IAM Permission Requirements

    • The profile used to authenticate Articulate LLM needs, at a minimum, the following permissions (a sample IAM policy is sketched after this list):
        • bedrock:ListInferenceProfiles
        • bedrock:ListFoundationModels
        • bedrock:ListCustomModels
        • bedrock:InvokeModel
        • bedrock:InvokeModelWithResponseStream
  3. AWS CLI Authentication Setup

    • Choose one of the following methods (example config files are sketched after this list):
      a. SSO (preferred):
        • Configure the AWS CLI with SSO in the ~/.aws/config file
        • Run 'aws configure sso' and follow the prompts
      b. Access Keys:
        • Generate an Access Key and Secret Key through IAM
        • Store them in the ~/.aws/credentials file
    • Ensure your chosen method is correctly referenced in the ~/.aws/config file
  4. Choose Your Region

    • Select a region where Bedrock is available:
      • US East (N. Virginia) - us-east-1
      • US West (Oregon) - us-west-2
    • Consider latency requirements for your use case
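
A minimal IAM identity policy granting just the permissions listed in step 2 might look like the sketch below; the statement ID and the broad "Resource": "*" are illustrative, and in production you may want to scope the resource down to specific model ARNs.

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Sid": "ArticulateLLMBedrockAccess",
          "Effect": "Allow",
          "Action": [
            "bedrock:ListInferenceProfiles",
            "bedrock:ListFoundationModels",
            "bedrock:ListCustomModels",
            "bedrock:InvokeModel",
            "bedrock:InvokeModelWithResponseStream"
          ],
          "Resource": "*"
        }
      ]
    }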
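
For step 3, the snippets below show roughly what 'aws configure sso' writes to ~/.aws/config and what an access-key profile in ~/.aws/credentials looks like; the profile name 'articulate-llm', the account ID, role name, start URL, and key values are all placeholders.

    # ~/.aws/config (SSO profile; values are placeholders)
    [profile articulate-llm]
    sso_session = my-sso
    sso_account_id = 111122223333
    sso_role_name = BedrockAccess
    region = us-east-1

    [sso-session my-sso]
    sso_start_url = https://my-org.awsapps.com/start
    sso_region = us-east-1
    sso_registration_scopes = sso:account:access

    # ~/.aws/credentials (access-key alternative; values are placeholders)
    [articulate-llm]
    aws_access_key_id = AKIAIOSFODNN7EXAMPLE
    aws_secret_access_key = wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY

With an SSO profile, run 'aws sso login --profile articulate-llm' to refresh credentials before launching the app.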

Pro Tip: Set up separate AWS credentials for development and production environments.


Download and Install

⏱️ Estimated time: 3-5 minutes

Download the Articulate LLM Desktop app for your platform:


Configure Your Bot

⏱️ Estimated time: 5 minutes

AWS Bedrock Setup

Choose Your Inference Provider

  1. Select "AWS Bedrock" from providers list
  2. Enter AWS Profile name
  3. Select primary region
  4. Click "Retrieve Models" to test connectivity
  5. Select a 'default' model to be used for app processes
  6. Click 'Create Endpoint' to move to the next step Set up your bot
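
If "Retrieve Models" fails, the same connectivity can be checked from a terminal; the profile name and region below are placeholders.

    # Should return a JSON list of foundation models if credentials, permissions, and region are correct.
    aws bedrock list-foundation-models --profile articulate-llm --region us-east-1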

Set Up Your Bot

On the bot configuration page:

  1. Enter a name for your bot (e.g., "My Assistant")
  2. Choose a model from the available options
  3. (Optional) Add a system prompt to define your bot's behavior
  4. Click "Create" to finish setup

Pro Tip: Use system prompts to specialize your bot for specific tasks.
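
As an illustration only, a system prompt along these lines narrows the bot to a well-defined role:

    You are a concise assistant for our internal engineering team.
    Answer in short bullet points, show commands in code blocks,
    and say "I don't know" rather than guessing.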


Advanced Configuration

Usage Monitoring

  • CloudWatch integration for metrics
  • Cost tracking through tags
  • Optional: enable model invocation logging via the Bedrock console or CLI (a CLI sketch follows this list)
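
The sketch below is one way to turn on model invocation logging to CloudWatch Logs; the log group name, role ARN, and region are placeholders, the role must allow Bedrock to write to the log group, and the exact parameter shape may differ slightly in newer CLI versions.

    # Illustrative only: sends Bedrock model invocation logs to a CloudWatch Logs group.
    aws bedrock put-model-invocation-logging-configuration \
      --logging-config '{
        "cloudWatchConfig": {
          "logGroupName": "/bedrock/articulate-llm",
          "roleArn": "arn:aws:iam::111122223333:role/BedrockLoggingRole"
        },
        "textDataDeliveryEnabled": true
      }' \
      --region us-east-1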

Performance Optimization

  • Use cross-region inference (inference profiles) for higher throughput (see the sketch after this list)
  • Select a region close to your users to reduce latency
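
Cross-region inference profiles can be listed from the CLI to find the profile ID to select in the app; the profile name and region below are placeholders.

    # Returns the inference profile IDs (e.g. IDs prefixed with "us." or "eu.") available in this region.
    aws bedrock list-inference-profiles --profile articulate-llm --region us-east-1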

Note: For detailed Bedrock configuration, refer to AWS Documentation.


Next Steps

Immediate Actions

  • Test your configuration
  • Experiment with different models
  • Monitor initial usage

Learning Resources

Troubleshooting

Common issues and solutions:

Issue                   Solution
Authentication Failed   Verify AWS credentials and permissions
Region Error            Confirm Bedrock availability in selected region
Quota Limits            Request quota increase from AWS
Model Access            Ensure model access is enabled in Bedrock console
Connection Timeout      Check network connectivity and AWS endpoints
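
A quick CLI check that narrows down the first issue (the profile name is a placeholder); the 'list-foundation-models' check shown in the configuration step covers region and model access in the same way.

    # Confirms which identity your profile resolves to (authentication and permissions).
    aws sts get-caller-identity --profile articulate-llm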

For help:


📝 Last updated: 2025-01-01
📦 Version: 1.0.0