
Solution 1: Build Your Llama2 LLM Solution with PAI-EAS and AnalyticDB for PostgreSQL

Explore the integration of AnalyticDB for PostgreSQL with large language models on Alibaba Cloud's Compute Nest, empowering businesses with efficiency.

By Haoran

This article introduces how to deploy an enterprise-level AI knowledge base dialogue using AnalyticDB for PostgreSQL and PAI-EAS. It uses AnalyticDB for PostgreSQL to retrieve the enterprise-specific knowledge base and PAI-EAS for AI language model inference. These two components are seamlessly integrated.

Step 1. Use AnalyticDB for PostgreSQL to Prepare Vector Store

1.  Go to the AnalyticDB for PostgreSQL console

2.  Click Create Instance


3.  Set Vector Engine Optimization to Enabled, and specify the VPC and vSwitch


4.  Choose 4 cores and 16 GB of memory with 2 nodes, and specify the VPC configuration.


5.  Create User Account


6.  Click Create Account


7.  In Database Connections, you can find the internal endpoint. Click Apply for Public Endpoint to enable public network access.


8.  Note this public endpoint; you will use it later.


9.  Go to Security Controls and create a whitelist to allow network access, e.g., 0.0.0.0/0 (this opens access to all IP addresses; restrict it in production).

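Once the public endpoint and whitelist are in place, you can verify connectivity from any machine with Python. This is a minimal sketch; the host, account, and password below are placeholders you must replace with the values from the steps above, and it assumes the psycopg2 driver is installed.

```python
def pg_dsn(host, user, password, dbname="postgres", port=5432):
    """Build a libpq-style DSN string for the AnalyticDB for PostgreSQL endpoint."""
    return (
        f"host={host} port={port} dbname={dbname} "
        f"user={user} password={password}"
    )

def check_connection(dsn):
    """Open a short-lived connection to confirm the endpoint and whitelist work."""
    import psycopg2  # pip install psycopg2-binary

    conn = psycopg2.connect(dsn)
    try:
        with conn.cursor() as cur:
            cur.execute("SELECT version();")
            print(cur.fetchone()[0])
    finally:
        conn.close()

if __name__ == "__main__":
    # Hypothetical values -- replace with your instance's public endpoint
    # and the account created in step 5.
    dsn = pg_dsn(
        host="gp-xxxxxxxx.gpdb.singapore.rds.aliyuncs.com",
        user="your_account",
        password="your_password",
    )
    check_connection(dsn)
```

If the whitelist is misconfigured, the connection attempt will time out rather than fail with an authentication error.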

Step 2. Use PAI-EAS to Provision LLM Inference Service

1.  Log on to the PAI console:

https://pai.console.aliyun.com/

2.  Click Create Workspace


3.  Input the Workspace Name and click Add Member


4.  You don’t need to associate MaxCompute or any other resource in the beginning, just click OK.


5.  Click Enter Elastic Algorithm Service


6.  For Deployment Method, choose Deploy Web App by Using Image, and select the PAI image chat-llm-webui, version 1.0


7.  Enter one of the following commands to specify which LLM model the service should serve:

Llama2 7B: python api/api_server.py --port=8000 --model-path=meta-llama/Llama-2-7b-chat-hf --disable-async
Llama2 13B: python api/api_server.py --port=8000 --model-path=meta-llama/Llama-2-13b-chat-hf --precision=fp16
ChatGLM: python api/api_server.py --port=8000 --model-path=THUDM/chatglm2-6b
Tongyi Qianwen 7B: python api/api_server.py --port=8000 --model-path=Qwen/Qwen-7B-Chat


8.  Specify the VPC, vSwitch, and security group. In the configuration file, manually add the RPC configuration:

"rpc": {
    "keepalive": 60000,
    "worker_threads": 1
},


After the service is provisioned, you can click the instance ID and find Invocation Method in Service Details, which shows the Public Endpoint and Token.


You can also use the internal VPC endpoint from the Python notebook for better performance and a more secure connection.

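With the endpoint and token from Invocation Method, you can call the service over HTTP. The sketch below uses only the standard library; the endpoint and token values are placeholders, and the request body format (a plain-text prompt) is an assumption — check the Invocation Method page for the exact format your image version expects.

```python
from urllib import request

def build_request(endpoint, token, prompt):
    """Build an HTTP request for the PAI-EAS service.

    EAS authenticates requests with the service token in the
    Authorization header; the plain-text prompt body is an assumption.
    """
    return request.Request(
        url=endpoint,
        data=prompt.encode("utf-8"),
        headers={"Authorization": token},
        method="POST",
    )

if __name__ == "__main__":
    # Hypothetical values -- copy the real ones from Service Details.
    endpoint = "http://service-id.region.pai-eas.aliyuncs.com/"
    token = "your-eas-token"
    req = build_request(endpoint, token, "What is machine learning PAI?")
    with request.urlopen(req, timeout=60) as resp:
        print(resp.read().decode("utf-8"))
```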

9.  Click View Web App to launch the Llama web UI


Step 3. Use PAI-DSW and Langchain to Integrate Llama2 and AnalyticDB for PostgreSQL Vector Store

1.  Go to PAI Console

2.  In Interactive Modeling (DSW) panel, click Create Instance


Choose the GPU P100 specification ecs.gn5-c4g1.xlarge with a 4-core CPU.

Note: Failures may occur with NVIDIA T4, so it is currently not recommended.


3.  For ECS Specifications, choose 4 vCPUs and 16 GB, e.g., ecs.g6.xlarge or ecs.g5.xlarge


4.  Click Image URL, input the image registry address, and specify the same VPC and vSwitch as your AnalyticDB for PostgreSQL instance.

Indonesia: registry.ap-southeast-5.aliyuncs.com/mybigpai/aigc_apps:1.0
Singapore: registry.ap-southeast-1.aliyuncs.com/mybigpai/aigc_apps:1.0
Beijing: registry.cn-beijing.aliyuncs.com/mybigpai/aigc_apps:1.0


5.  Create the VPC and vSwitch

6.  Create the security group

7.  After the instance is running, click Launch


If you see this warning, please ignore it.


8.  Click File -> Open Folder


Input /home/LLM_Solution/


9.  Change url, token, PG_host, PG_user, and PG_password to the values you configured above.

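The result should look roughly like the fragment below. All values are placeholders, and the repository's config.json may contain additional fields beyond the five mentioned above:

```json
{
  "url": "http://service-id.region.pai-eas.aliyuncs.com/",
  "token": "your-eas-token",
  "PG_host": "gp-xxxxxxxx.gpdb.singapore.rds.aliyuncs.com",
  "PG_user": "your_account",
  "PG_password": "your_password"
}
```

The url and token come from the PAI-EAS Invocation Method page in Step 2, and the PG_* values from the AnalyticDB instance created in Step 1.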

10.  Open a new terminal


11.  Run the following commands in the terminal. The first uploads documents to the vector store; the second runs a query:

python main.py --config config.json --upload True 


python main.py --config config.json --query "what is machine learning PAI" 


Step 4. Redevelop Your Code

1.  Review main.py; you can modify the Python code to customize the solution.

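Conceptually, main.py implements a retrieval-augmented generation (RAG) loop: retrieve passages similar to the question from the vector store, assemble them into a prompt, and send the prompt to the LLM. A minimal sketch of the prompt-assembly step follows; build_prompt and the sample contexts are illustrative, not the repository's actual code:

```python
def build_prompt(question, contexts):
    """Assemble a retrieval-augmented prompt: retrieved passages + question."""
    background = "\n".join(f"- {c}" for c in contexts)
    return (
        "Answer the question based on the following context:\n"
        f"{background}\n"
        f"Question: {question}\n"
        "Answer:"
    )

if __name__ == "__main__":
    # In main.py the contexts come from a vector similarity search in
    # AnalyticDB for PostgreSQL; here they are hard-coded for illustration.
    contexts = [
        "PAI is Alibaba Cloud's machine learning platform.",
        "PAI-EAS serves models as online inference services.",
    ]
    print(build_prompt("What is machine learning PAI?", contexts))
```

Changing the prompt template is one of the simplest customizations: for example, you can instruct the model to answer only from the provided context.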

2.  You can add more knowledge by uploading files to the docs folder. Right-click docs and click Upload to upload a file.


For example, PDF files are also supported.


Code Reference:

https://github.com/aigc-apps/LLM_Solution/tree/master


Farruh

29 posts | 16 followers
