Data Agent for Analytics is an intelligent agent from the Alibaba Cloud ApsaraDB team that interprets your natural language requests, understands your data, proposes new avenues for analysis, and runs tasks to deliver results.
Use cases
Business users: Use plain language to define data requirements and quickly generate reports on sales trends, user behavior, and more.
Data analysts: Perform exploratory analysis, offload time-consuming tasks like automated feature engineering and data cleaning to the agent, and reuse the generated Python code.
Managers: Quickly gain deep insights into key business indicators to support data-driven decision-making.
Features
Automated insight discovery: Describe your requirements in natural language, and the agent autonomously understands the request, interprets the data, discovers insights, and generates a report.
Faster time to value: Focus on business outcomes, not complex data workflows.
Seamless data connection: Natively supports ApsaraDB and DMS-managed data sources, allowing you to start analyzing your data instantly.
Integrations
Core component: Data Agent Skill
One-click integration: Install directly from a Git repository or copy the following command to quickly deploy and load the agent.
```shell
npx skills add https://github.com/aliyun/data-agent-skill
```

Feature summary: You can download and seamlessly integrate this skill package into your agent service. It is designed to build a secure bridge to your enterprise data sources. By using standard connection protocols, it significantly enhances the agent's data access capabilities and analytical depth, helping you quickly build custom data intelligence applications.
Limitations
Data Agent for Analytics is currently available only in the following regions: China (Hangzhou), China (Shanghai), China (Shenzhen), China (Chengdu), China (Beijing), China (Zhangjiakou), China (Hong Kong), and Singapore.
Interface layout
Initial interface:
Data Center: Add and upload data for analysis.
All Tasks: View the history of your analysis tasks.
User information: View details of the current account.
Chat area: Start a new analysis task and interact with the agent.
Workspace interface:
Chat area: Continue your conversation with the agent during the analysis.
Execution area: View the agent-written code and its corresponding results for each step.
Procedure
Choose an edition that suits your business needs. For a detailed comparison, see Data Agent editions.
Free Edition:
On the Free Edition card, click Start Free Trial.
Personal Edition:
On the Personal Edition card, click Upgrade to Personal Edition.
Select a region, the number of seats, and a subscription duration.
Note: In the Subscription Duration section, click a monthly duration to select the desired period.
Click Buy Now and complete the purchase.
Enterprise Edition:
On the Enterprise Edition card, click Upgrade to Enterprise Edition.
Select a region, the number of Large Language Model (LLM) resource packages, and a subscription duration.
Note: In the Subscription Duration section, click a monthly duration to select a period from 1 to 11 months.
Click Buy Now and complete the purchase.
Upload a data sample. You can upload a local file or use existing data.
Click the upload icon to upload a local file. Files in CSV, XLSX, and XLS formats are supported, up to 200 MB each.
Click the add-data icon to use existing data. You can add data in the Data Center.
Describe your analysis requirements in the chat area, then press Enter or click the send icon to start the analysis.
Wait for the agent to generate an execution plan, and review the plan.
If the plan is satisfactory, click Start Task.
If not, click Modify Tasks to provide additional details until the plan meets your needs.
Wait for the plan to execute. During execution, you can view the code written by the agent for each step and the corresponding execution result.
The agent can generate a visual web report for a richer presentation of data insights.
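To picture what the execution area might show, here is a hedged sketch of the kind of pandas code the agent could write for a request such as "show monthly sales trends". The data, file-free setup, and column names (`order_date`, `amount`) are illustrative assumptions, not actual agent output:

```python
import pandas as pd

# Illustrative stand-in for an uploaded sales file; in practice the agent
# would read your CSV/XLSX data instead of constructing a DataFrame inline.
df = pd.DataFrame(
    {
        "order_date": pd.to_datetime(["2024-01-05", "2024-01-20", "2024-02-11"]),
        "amount": [120.0, 80.0, 200.0],
    }
)

# Aggregate sales by calendar month (month-start frequency).
monthly = (
    df.set_index("order_date")["amount"]
    .resample("MS")
    .sum()
)
print(monthly)
```

Because each step's code is visible in the execution area, snippets like this can be copied and reused outside the agent, as noted in the data-analyst use case above.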
Data Center
In the left-side navigation pane, click Data Center. On the page that appears, click Add data and select a connection method.
Local upload:
Drag the file you want to analyze to the upload area, or click the area to select a file. After the upload is complete, click Confirm.
Database connection:
The Free and Personal Editions support RDS, PolarDB, and AnalyticDB databases. The Enterprise Edition supports over 40 data sources. For a complete list, see Databases Supported by DMS.
Select a region and an instance.
Enter the Database user name and Database password.
Click Test Connection to continue.
Select the database and table.
Click Confirm.
Long-term memory
Long-term memory is a core capability of the agent. During your conversations, the agent automatically identifies and saves key information as long-term memory. This memory is recalled in future sessions to better understand your business requirements, improve analysis results, and provide a more contextual experience.
Manage memory settings
This feature is enabled by default to continuously optimize your analysis experience. If you do not want the system to store memories automatically, you can disable this feature.
Find the Generate memory switch and turn it off.
Memory list
On the Memory page, you can view and manage all stored memories. The list includes the following information:
| Field | Description |
| --- | --- |
| Source | The conversation in which the memory was generated. You can click the link to go to the original conversation for context. |
| Content | The key information that the agent extracted and understood from the conversation. |
| Heat | A score based on how frequently a memory is recalled and used. A higher heat value indicates that the memory is used more often and is more important in your analysis work. |
| Status | Memories have two statuses. |
| Operation | You can edit or delete memories in the Actions column. |
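The Heat field can be pictured with a small, purely hypothetical model in which heat is simply a recall counter, so memories recalled more often rank higher. Nothing here reflects the product's internal scoring; it only illustrates what "more frequently used means higher heat" looks like:

```python
from collections import Counter

# Hypothetical recall log: each entry records that a stored memory
# was recalled during a session. The memory IDs are made up.
recalls = ["m1", "m2", "m1", "m3", "m1", "m2"]

# Heat as a plain recall count (illustrative only).
heat = Counter(recalls)

# Memories ranked from hottest to coldest.
ranking = [mem for mem, _ in heat.most_common()]
print(ranking)
```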
Data privacy
The agent does not use your personal data for model training or iteration. This includes uploaded files, query content, analysis results, and generated reports.
Your data is processed in an exclusive computing instance that runs on demand within your account. The instance and its data are destroyed after the task is complete. Other accounts cannot access the data, processes, or history associated with your account.
For analysis based on local files, the original uploaded file is stored in an isolated environment and is not visible to other accounts.
For database analysis, data remains in your database instance. Data is not migrated during the computation process. All intermediate results are destroyed after the analysis completes and are not stored.