RISHI RAJ S GERA

SVP Edtech Services, Magic Edtech


Amazon SageMaker JumpStart now supports fine-tuning of Foundation Models with domain adaptation

Advanced learning
    • May 25, 2023

Starting today, Amazon SageMaker JumpStart supports fine-tuning large language models, specifically text generation models, on domain-specific datasets. Customers can now fine-tune models on their own data to improve performance in a particular domain. For example, this blog describes how to use domain adaptation to fine-tune a GPT-J 6B model on publicly available financial data from the Securities and Exchange Commission so that the model generates more relevant text for financial services use cases. Customers can fine-tune Foundation Models such as the GPT-J 6B and GPT-J 6B FP16 models for domain adaptation in JumpStart, either through the UI inside Amazon SageMaker Studio or through the SageMaker Python SDK.
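As a rough sketch of the SDK path, domain adaptation fine-tuning takes a plain-text corpus of domain documents as training data, which is then uploaded to S3 and passed to a JumpStart training job. The snippet below prepares such a corpus and shows a guarded launcher; the `model_id`, instance type, and hyperparameter names are illustrative assumptions, so check the JumpStart model card in Studio for the exact values before running a real job.

```python
def prepare_domain_corpus(documents, out_path="train.txt"):
    """Concatenate domain-specific documents into a single plain-text
    training file, the input format domain adaptation expects."""
    with open(out_path, "w", encoding="utf-8") as f:
        for doc in documents:
            f.write(doc.strip() + "\n\n")
    return out_path

def launch_fine_tuning(train_s3_uri, run=False):
    """Guarded launcher: constructing and fitting the estimator needs
    AWS credentials and incurs cost, so it only executes when run=True."""
    if not run:
        return None
    from sagemaker.jumpstart.estimator import JumpStartEstimator
    estimator = JumpStartEstimator(
        # model_id is an assumption; browse JumpStart for the real id.
        model_id="huggingface-textgeneration1-gpt-j-6b",
        instance_type="ml.g5.12xlarge",  # assumed instance type
        hyperparameters={"epochs": "3", "learning_rate": "5e-5"},
    )
    # The training channel points at the S3 location of train.txt.
    estimator.fit({"training": train_s3_uri})
    return estimator

# Local, runnable part: build the training file from sample passages.
corpus = [
    "Example SEC filing text about quarterly earnings.",
    "Another domain-specific passage for adaptation.",
]
path = prepare_domain_corpus(corpus)
```

After training completes, the returned estimator can be deployed with `estimator.deploy()` to serve the domain-adapted model from a SageMaker endpoint.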
