
Platform for AI: LangStudio

Last Updated: Nov 04, 2025

Build enterprise-grade LLM applications with LangStudio, a visual platform that simplifies the entire development lifecycle. Get started quickly with built-in components like Knowledge Base, Web Search, and Agent to go from rapid prototyping to one-click production deployment.

Product architecture

image
  • Develop application flows

    • Application flow editor: Use a drag-and-drop canvas to connect components such as LLMs, Python code, and external tools to build an executable application flow.

    • Built-in templates: Start building immediately with out-of-the-box templates for common use cases, such as Knowledge Base Q&A and Natural Language to SQL (NL2SQL), reducing repetitive setup.

    • Connection configuration: Manage authentication credentials and connection parameters for external services, such as databases, APIs, and models, in a centralized location. This lets you configure a connection once and reuse it across multiple applications.

  • Debug and analyze performance: Quickly identify bottlenecks and errors with online debugging and trace tracking to improve application stability and response time.

    • Debug online: Test your application logic in real time. Use the chat window in the development interface to enter prompts and instantly review the output.

    • Analyze traces: Visualize the entire execution path as a trace. To pinpoint sources of latency or logical flaws, examine the input, output, duration, and error logs for each node.

  • Deploy an application flow

    • Deploy with one click: Deploy your application flow to PAI-Elastic Algorithm Service (PAI-EAS) with a single click, which automatically generates a RESTful API with support for authentication and traffic control.

    • Observe online: After deployment, view request details, execution traces, and runtime logs to continuously monitor service health and facilitate troubleshooting.

Benefits

  • Seamless integration with the PAI ecosystem: LangStudio offers an end-to-end solution that covers model invocation, code development, batch evaluation, and elastic deployment, accelerating the entire application lifecycle from Proof of Concept (POC) to production deployment.

  • Production-grade monitoring and tuning: Deep integration with Managed Service for OpenTelemetry and Simple Log Service (SLS) enables comprehensive trace tracking and performance analysis.

  • Enterprise-grade security and stability: LangStudio ensures high availability and security for your applications with features like Virtual Private Cloud (VPC) network isolation, fine-grained role-based access control, and dedicated resource deployment.

Use cases

Build enterprise-grade RAG applications

For specialized domains where knowledge is frequently updated and general-purpose LLMs lack accuracy, you can build a Retrieval-Augmented Generation (RAG) application. By integrating your private enterprise knowledge base, you can improve the relevance and timeliness of the model's responses.

  • Update knowledge dynamically: Use the built-in data synchronization tool to configure scheduled tasks that automatically update the knowledge base index, ensuring the model's answers are always current.

  • Ensure security and privacy: Ensure your corporate data remains secure and isolated during model training and inference, in strict compliance with data privacy regulations and internal audit requirements.
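The retrieve-then-generate pattern behind such a RAG application can be sketched as follows. This is a minimal illustration, not LangStudio's implementation: the knowledge base, the word-overlap relevance score, and the prompt template are all stand-ins for what the platform's Knowledge Base and LLM nodes actually do.

```python
# Minimal sketch of the retrieve-then-generate pattern in a RAG flow.
# The documents and the scoring function are toy examples; in LangStudio,
# retrieval is handled by the Knowledge Base component.

KNOWLEDGE_BASE = [
    "Refunds are processed within 5 business days.",
    "Support is available Monday through Friday, 9:00-18:00.",
]

def retrieve(question: str, top_k: int = 1) -> list[str]:
    # Toy relevance score: number of words shared with the question.
    q_words = set(question.lower().split())
    scored = sorted(
        KNOWLEDGE_BASE,
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(question: str) -> str:
    # Ground the model's answer in the retrieved passages.
    context = "\n".join(retrieve(question))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"
```

The assembled prompt is then sent to the LLM node, so the model answers from the retrieved passages rather than from its training data alone.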

Build an NL2SQL BI assistant

Non-technical users often struggle to write SQL queries. A Natural Language to SQL (NL2SQL) assistant lets users describe their data needs in plain language. The assistant then automatically converts these descriptions into structured SQL statements. This lowers the barrier to entry and accelerates report generation.

  • Generate queries intelligently: Understands user intent from natural language and generates the corresponding SQL statements for your database.

  • Generate reports automatically: Automatically creates charts or data reports based on the query results.

  • Gain data insights and suggestions: Provides intelligent business insights and decision-making suggestions based on historical data analysis.
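At its core, an NL2SQL node sends the database schema and the user's request to an LLM and asks for a SQL statement back. The following sketch shows one plausible way to build that prompt; the table, columns, and wording are invented for illustration and are not LangStudio's actual template.

```python
# Hypothetical sketch of the prompt an NL2SQL step might send to an LLM.
# The schema below is invented for illustration.

SCHEMA = """\
CREATE TABLE orders (
    order_id INT,
    customer VARCHAR(64),
    amount DECIMAL(10, 2),
    created_at DATE
);"""

def build_nl2sql_prompt(request: str) -> str:
    # Give the model the schema so generated SQL uses real table and
    # column names, and constrain the output to SQL only.
    return (
        "Given this schema:\n"
        f"{SCHEMA}\n\n"
        "Write a single SQL statement that answers the request. "
        "Return only the SQL, with no explanation.\n"
        f"Request: {request}"
    )
```

The returned SQL is then executed against the connected database, and the results feed the chart or report generation steps described above.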

Build a multimodal chat agent

Build an agent that can process multimodal inputs by integrating models for speech recognition, image understanding, and more. Use the built-in state management and tool-calling nodes to orchestrate complex task flows.

Supported regions

LangStudio is available in the following regions: China (Hangzhou), China (Shanghai), China (Beijing), China (Ulanqab), China (Shenzhen), China (Hong Kong), Singapore, Japan (Tokyo), Germany (Frankfurt), and US (Virginia). To reduce latency, choose a region near your data sources or target users.

Billing

The LangStudio platform itself is free of charge. However, you are billed separately for the underlying cloud services, according to their respective pricing models. These include OSS for storing project files, Managed Service for OpenTelemetry for trace tracking, SLS for log collection, and PAI-EAS for service deployment. For more information, see Billing of LangStudio.

How it works

The following figure shows the application development and deployment process.

image
Note

If you are a first-time user, activate the required cloud services and grant the necessary service role permissions. For more information, see Grant permissions that are required to use LangStudio.

The following steps demonstrate how to create a simple conversational application.

  1. Create a model service connection. On the Connection > Model Service page, click New Connection. Set Connection Type to Alibaba Cloud Model Studio Service. Follow the on-screen instructions to get an API key from Model Studio.

  2. Create an application flow. On the Application Flow page, click Create Application Flow. Set Creation Method to Create by Type and select Chat Type. Select an Object Storage Service (OSS) bucket as the working path to store the application flow's configuration and related files.

  3. Edit the application flow. Go to the application flow details page. Click the LLM node. Select the connection that you created in step 1. Select a model, such as qwen-max.

  4. Attach a runtime and debug. Before debugging, you must attach a runtime to the application flow. The runtime provides the necessary debugging environment.

    1. In the dropdown menu at the top left of the page, select an existing runtime or click New Runtime.

    2. After the runtime is attached, click Run and enter a question in the dialog box. Use the trace and logs below the output to iterate on your application, diagnose issues, and optimize performance.

      image

  5. Deploy to PAI-EAS. In the upper-right corner, click Deploy. Select the appropriate deployment resources and configure the VPC.

    Note

    Alibaba Cloud Model Studio requires public network access, but PAI-EAS services cannot access the Internet by default. Therefore, you must enable public network access for the PAI-EAS service by configuring a VPC with a NAT Gateway.

  6. Test the service API. After the deployment is successful, you are redirected to the PAI-EAS service details page. On the Online Debugging tab, configure and send a request. For example: {"question":"What is 3+9?"}. For more information, see Call the service.

    Important

    The key in the request body (in this example, question) must match the input parameter name defined in the Start node of the application flow.
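After debugging in the console, you can also call the service programmatically. The sketch below builds such a request with Python's standard library; the endpoint URL and token are placeholders that you must replace with the values shown on your PAI-EAS service details page, and the `question` key must match your flow's Start node parameter.

```python
import json
import urllib.request

# Placeholder values: replace with the endpoint and token from the
# PAI-EAS service details page.
EAS_ENDPOINT = "https://example.pai-eas.aliyuncs.com/api/predict/my_langstudio_app"
EAS_TOKEN = "<your-service-token>"

def build_request(question: str) -> urllib.request.Request:
    # The JSON key ("question" here) must match the input parameter
    # name defined in the Start node of the application flow.
    payload = json.dumps({"question": question}).encode("utf-8")
    return urllib.request.Request(
        EAS_ENDPOINT,
        data=payload,
        headers={
            "Authorization": EAS_TOKEN,
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request("What is 3+9?")
# Sending is omitted here; call urllib.request.urlopen(req) against a
# real endpoint to get the flow's response.
```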