Amazon Web Services (AWS) has launched a new service, dubbed Amazon Bedrock, which provides access to several foundation models that businesses can customize to build their own generative AI applications, including programs for general commercial use.
Amazon Bedrock provides users with foundation models from AI21 Labs, Anthropic, Stability AI, and Amazon, accessible through an API. The service, announced Thursday and now in private preview, comes just a day after Databricks announced its own open source large language model (LLM), Dolly 2.0, and has a similar strategy: to help companies circumvent the constraints of closed models (like ChatGPT) that prevent them from creating their own custom generative AI applications.
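API access to a hosted foundation model generally means sending a JSON request body and reading back a generated response. The sketch below is illustrative only: the request schema varies by model provider, the field names shown are assumptions rather than documented Bedrock parameters, and the commented-out `boto3` call reflects the kind of client interface AWS SDKs expose, not the preview-era API described in this article.

```python
import json

# Hypothetical request body for a text-generation call. The exact
# schema differs per model provider; this shape is illustrative.
body = json.dumps({
    "inputText": "Summarize our Q3 earnings call in three bullet points.",
    "textGenerationConfig": {"maxTokenCount": 256, "temperature": 0.2},
})

# With AWS credentials configured, invoking a hosted model would look
# roughly like this (MODEL_ID is a placeholder, not a real identifier):
#   import boto3
#   client = boto3.client("bedrock-runtime")
#   resp = client.invoke_model(modelId=MODEL_ID, body=body)
#   print(json.loads(resp["body"].read()))

print(body)
```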
The licensing terms of closed foundation models typically prevent companies from creating any form of generative AI that competes with the original.
This constraint has accelerated research into open source models and other alternative generative AI approaches suitable for commercial use, as companies demand customizable models for specific use cases.
Amazon’s foundation models for AI
The Amazon foundation models available through the new service include Amazon Titan FMs, which consist of two new LLMs (AI models trained on large amounts of text to generate human-like responses). The models were previewed on Thursday and are expected to become generally available in the coming months.
According to the company, the first Titan foundation model is a generative LLM for tasks such as summarization, text generation, classification, open-ended question answering, and information extraction.
The second is an LLM that translates text inputs (words, sentences, or potentially larger units of text) into numerical representations, called embeddings, that capture the semantic meaning of the text, the company said.
“Although this LLM does not generate text, it is useful for applications such as personalization and search because, by comparing embeddings, the model will produce more relevant and contextual responses than word matching,” Swami Sivasubramanian, vice president of data and machine learning at AWS, wrote in a blog post.
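Comparing embeddings usually comes down to a vector-similarity measure such as cosine similarity: texts with similar meaning map to vectors that point in similar directions, even when they share no words. The vectors below are made up for illustration; a real embeddings model would produce them, typically with hundreds or thousands of dimensions.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity: 1.0 means same direction, near 0 means unrelated."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 4-dimensional embeddings; the values are illustrative only.
query = np.array([0.9, 0.1, 0.0, 0.2])
doc_a = np.array([0.8, 0.2, 0.1, 0.3])  # semantically close to the query
doc_b = np.array([0.0, 0.1, 0.9, 0.0])  # unrelated content

print(cosine_similarity(query, doc_a))  # high score: good match
print(cosine_similarity(query, doc_b))  # low score: poor match
```

A search or personalization system would embed the query once, then rank candidate documents by this score rather than by shared keywords.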
These foundation models, AWS said, have been tuned to detect and remove harmful content in the enterprise data provided for customization, and can also filter harmful content from model outputs.
Amazon Bedrock, according to Sivasubramanian, can be used to find the right model for a given use case and customize it with enterprise datasets, before integrating and deploying it into applications using AWS tools.
“Customers simply point Bedrock to a few labeled examples in Amazon S3, and the service can fine-tune the model for a particular task without having to annotate large volumes of data (as few as 20 examples are enough),” Sivasubramanian said.
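A small set of labeled examples like the one Sivasubramanian describes would typically be a JSONL file of input/output pairs uploaded to an S3 bucket. The sketch below shows that idea; the field names ("prompt", "completion"), the bucket name, and the key are assumptions for illustration, not a documented Bedrock schema.

```python
import json

# A handful of labeled examples for a hypothetical ticket-classification
# task: each line pairs an input with the desired output.
examples = [
    {"prompt": "Ticket: My card was charged twice.", "completion": "billing"},
    {"prompt": "Ticket: The app crashes on launch.", "completion": "bug"},
    {"prompt": "Ticket: How do I export my data?", "completion": "how-to"},
]

# One JSON object per line (JSONL), a common fine-tuning data layout.
jsonl = "\n".join(json.dumps(e) for e in examples)
print(jsonl)

# Uploading the file so the service can be "pointed" at it might look like:
#   import boto3
#   boto3.client("s3").put_object(
#       Bucket="my-training-bucket", Key="tickets.jsonl", Body=jsonl)
```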
Training AI models without using customer data
“None of the customer’s data is used to train the underlying models, and since all data is encrypted and does not leave a customer’s virtual private cloud (VPC), customers can be confident that their data will remain private and confidential,” Sivasubramanian added.
In addition to the Titan foundation models, Bedrock includes the Jurassic-2 family of multilingual LLMs from AI21 Labs, which follow natural language instructions to generate text in Spanish, French, German, Portuguese, Italian, and Dutch.
Anthropic’s LLM, Claude, also included in Bedrock, can perform a wide variety of conversational and text-processing tasks and is based on Anthropic’s extensive research into training honest and responsible AI systems, AWS said.
Additionally, Bedrock includes a text-to-image foundation model from Stability AI’s text-to-image suite.
AWS also highlighted Amazon CodeWhisperer, its AI-based code generator, which competes with Copilot from Microsoft-owned GitHub; GitHub recently added AI features of its own to help developers.
AWS also announced the general availability of Inf2 cloud instances powered by AWS Inferentia2. These instances, according to the company, are “optimized specifically for large-scale generative AI applications with models containing hundreds of billions of parameters.”
Copyright © 2023 IDG Communications, Inc.