AI21 Labs’ Jamba 1.5 Models Join Amazon Bedrock Platform



Today, we are excited to announce the launch of AI21 Labs’ latest Jamba 1.5 family of large language models (LLMs) on Amazon Bedrock. These new models represent a significant leap in long-context language modeling, bringing enhanced speed, efficiency, and performance to a wide array of applications. The Jamba 1.5 family includes two variants: Jamba 1.5 Mini and Jamba 1.5 Large. Both models come with a 256K-token context window, support for structured JSON output, function calling capabilities, and the ability to process document objects.

AI21 Labs is renowned for its expertise in developing foundation models and artificial intelligence (AI) systems tailored to enterprise needs. Through a strategic collaboration with AWS, AI21 Labs aims to empower customers across various industries to create, deploy, and scale generative AI applications. These applications address real-world challenges and drive innovation. By combining AI21 Labs’ cutting-edge models with Amazon’s robust services and infrastructure, customers can utilize LLMs in a secure environment, shaping the future of information processing, communication, and learning.

What is Jamba 1.5?

The Jamba 1.5 models feature a unique hybrid architecture that merges the traditional transformer model architecture with Structured State Space model (SSM) technology. This innovative approach enables Jamba 1.5 models to handle long context windows of up to 256K tokens while maintaining the high-performance characteristics of transformer models. For a deeper understanding of this hybrid SSM/transformer architecture, you can refer to the [Jamba: A Hybrid Transformer-Mamba Language Model](https://arxiv.org/pdf/2403.19887) whitepaper.

New Jamba 1.5 Models in Amazon Bedrock

Two new Jamba 1.5 models are now available on Amazon Bedrock:

  • Jamba 1.5 Large: This model excels at complex reasoning tasks across all prompt lengths, making it ideal for applications that require high-quality outputs on both long and short inputs.
  • Jamba 1.5 Mini: Optimized for low-latency processing of long prompts, this model enables fast analysis of lengthy documents and data.

Key Strengths of Jamba 1.5 Models

The Jamba 1.5 models offer several key advantages:

  • Long Context Handling: With a 256K token context length, Jamba 1.5 models can enhance enterprise applications such as lengthy document summarization and analysis, as well as agentic and retrieval-augmented generation (RAG) workflows.
  • Multilingual Support: These models support multiple languages, including English, Spanish, French, Portuguese, Italian, Dutch, German, Arabic, and Hebrew.
  • Developer-Friendly: The models natively support structured JSON output and function calling, and can process document objects (see the function-calling sketch after this list).
  • Speed and Efficiency: AI21 Labs has reported that Jamba 1.5 models demonstrate up to 2.5 times faster inference on long contexts compared to other models of similar sizes. For detailed performance results, visit the [Jamba model family announcement on the AI21 website](https://www.ai21.com/blog/announcing-jamba-model-family).
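
To illustrate the developer-facing features, here is a minimal sketch of function calling with Jamba 1.5 through the Amazon Bedrock Converse API. The `get_weather` tool, its schema, and the prompt are illustrative assumptions rather than anything from AI21’s or AWS’s documentation; the `toolConfig` structure follows the generic Converse API.

```python
import boto3

# Bedrock Runtime client in a Region where Jamba 1.5 is available.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Hypothetical tool definition; the name and schema are for illustration only.
tool_config = {
    "tools": [
        {
            "toolSpec": {
                "name": "get_weather",
                "description": "Return the current weather for a city.",
                "inputSchema": {
                    "json": {
                        "type": "object",
                        "properties": {"city": {"type": "string"}},
                        "required": ["city"],
                    }
                },
            }
        }
    ]
}

response = client.converse(
    modelId="ai21.jamba-1-5-mini-v1:0",
    messages=[{"role": "user", "content": [{"text": "What's the weather in Seattle?"}]}],
    toolConfig=tool_config,
)

# If the model decides to call the tool, the response contains a toolUse block.
if response.get("stopReason") == "tool_use":
    for block in response["output"]["message"]["content"]:
        if "toolUse" in block:
            print(block["toolUse"]["name"], block["toolUse"]["input"])
else:
    print(response["output"]["message"]["content"][0]["text"])
```

In a real application, you would execute the requested function yourself and return its result to the model in a follow-up message so it can produce the final answer.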

Getting Started with Jamba 1.5 Models in Amazon Bedrock

To begin using the new Jamba 1.5 models, navigate to the [Amazon Bedrock console](https://console.aws.amazon.com/bedrock), select Model access from the bottom left pane, and request access to either Jamba 1.5 Mini or Jamba 1.5 Large.

For testing the Jamba 1.5 models in the Amazon Bedrock console, choose the Text or Chat playground from the left menu pane. Then, select AI21 as the category and choose either Jamba 1.5 Mini or Jamba 1.5 Large as the model.

By selecting View API request, you can obtain a code example of how to invoke the model using the [AWS Command Line Interface (AWS CLI)](https://aws.amazon.com/cli/) with the current example prompt.

You can follow the [code examples in the Amazon Bedrock documentation](https://docs.aws.amazon.com/bedrock/latest/userguide/service_code_examples_bedrock-runtime.html) to access available models using [AWS SDKs](https://aws.amazon.com/developer/tools/) and build your applications using various programming languages.
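
Before invoking a model from code, it can help to confirm the exact model IDs offered in your Region. The following sketch uses the Amazon Bedrock control-plane client’s ListFoundationModels operation and filters for AI21 Labs models; note that this lists the models available in the Region, and you still need to have requested access as described above.

```python
import boto3

# Control-plane client ("bedrock"), distinct from the runtime client ("bedrock-runtime").
bedrock = boto3.client("bedrock", region_name="us-east-1")

# List foundation models in the Region and print the AI21 Labs model IDs.
response = bedrock.list_foundation_models()
for model in response["modelSummaries"]:
    if "AI21" in model.get("providerName", ""):
        print(model["modelId"], "-", model["modelName"])
```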

Here’s a Python code example demonstrating how to send a text message to Jamba 1.5 models using the Amazon Bedrock Converse API for text generation:

```python
import boto3
from botocore.exceptions import ClientError

# Create a Bedrock Runtime client.
bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

# Set the model ID.
# model_id = "ai21.jamba-1-5-mini-v1:0"
model_id = "ai21.jamba-1-5-large-v1:0"

# Start a conversation with the user message.
user_message = "What are 3 fun facts about mambas?"
conversation = [
    {
        "role": "user",
        "content": [{"text": user_message}],
    }
]

try:
    # Send the message to the model, using a basic inference configuration.
    response = bedrock_runtime.converse(
        modelId=model_id,
        messages=conversation,
        inferenceConfig={"maxTokens": 256, "temperature": 0.7, "topP": 0.8},
    )

    # Extract and print the response text.
    response_text = response["output"]["message"]["content"][0]["text"]
    print(response_text)

except (ClientError, Exception) as e:
    print(f"ERROR: Can't invoke '{model_id}'. Reason: {e}")
    exit(1)
```

The Jamba 1.5 models are ideal for use cases such as paired document analysis, compliance analysis, and answering questions from long documents. They can easily compare information across multiple sources, ensure passages meet specific guidelines, and handle very long or complex documents. You can find example code in the [AI21-on-AWS GitHub repo](https://github.com/aws-samples/AI21-on-AWS). For tips on effectively prompting Jamba models, check out [AI21’s documentation](https://docs.ai21.com/docs/prompt-engineering).
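
As a rough illustration of the paired document analysis pattern, the sketch below packs two documents into a single user message and asks the model to compare them. The file names and the comparison prompt are assumptions made for the example; in practice you would load your own documents and keep the combined input within the 256K-token context window.

```python
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

def compare_documents(doc_a: str, doc_b: str) -> str:
    """Ask Jamba 1.5 Large to compare two documents in a single long-context prompt."""
    prompt = (
        "Compare the two contracts below and list any clauses that differ.\n\n"
        f"--- Document A ---\n{doc_a}\n\n"
        f"--- Document B ---\n{doc_b}"
    )
    response = client.converse(
        modelId="ai21.jamba-1-5-large-v1:0",
        messages=[{"role": "user", "content": [{"text": prompt}]}],
        inferenceConfig={"maxTokens": 1024, "temperature": 0.2},
    )
    return response["output"]["message"]["content"][0]["text"]

# Hypothetical usage: load two local files and print the comparison.
with open("contract_v1.txt") as f1, open("contract_v2.txt") as f2:
    print(compare_documents(f1.read(), f2.read()))
```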

Now Available

The Jamba 1.5 family of models from AI21 Labs is now generally available on Amazon Bedrock in the US East (N. Virginia) [AWS Region](https://aws.amazon.com/about-aws/global-infrastructure/regions_az/). Check the [full Region list](https://docs.aws.amazon.com/bedrock/latest/userguide/models-regions.html) for future updates. For more information, visit the [AI21 Labs in Amazon Bedrock](https://aws.amazon.com/bedrock/ai21/) product page and the [pricing page](https://aws.amazon.com/bedrock/pricing/).

Give the Jamba 1.5 models a try in the [Amazon Bedrock console](https://console.aws.amazon.com/bedrock) today and send your feedback to [AWS re:Post for Amazon Bedrock](https://repost.aws/tags/TAQeKlaPaNRQ2tWB6P7KrMag/amazon-bedrock) or through your usual AWS Support contacts.

Visit our [community.aws](https://community.aws/generative-ai?trk=5aadd1a3-56ee-4a3b-9314-21a4f0e684ed&sc_channel=el) site to find in-depth technical content and discover how our Builder communities are utilizing Amazon Bedrock in their solutions.