This topic describes how to use Qoder, AnalyticDB for PostgreSQL Supabase, and the Qwen Image Edit model to quickly build an AI figurine generation Flutter application without a traditional backend. This solution covers frontend code auto-generation, Backend-as-a-Service (BaaS) configuration, and AI model integration. This guide is for developers who want to quickly validate AI-native application prototypes and achieve agile development.
Overview
Traditional backend architectures are being redefined for native AI applications. This solution uses a lightweight and agile architecture that combines the following core technologies, allowing you to quickly build an AI figurine generation Flutter application without a traditional backend.
Frontend: Qoder automatically generates Flutter code for the user interface and interactions. Qoder is an AI-driven IDE agent that generates high-quality Flutter code based on your requirements. After creating an empty project with the Flutter plugin, you can describe the core features and debug iteratively to generate a runnable mobile application.
Backend-as-a-Service (BaaS): AnalyticDB Supabase provides data storage, object storage, and edge function capabilities. This removes much of the complexity of traditional backend development.
AI capability integration: AnalyticDB Supabase edge functions connect to the Qwen Image Edit model to perform image editing.
Prerequisites
You have created a Supabase project.
You have enabled public network access for your AnalyticDB for PostgreSQL Supabase project.
You have obtained a Model Studio API key for calling the Qwen Image Edit model.
Procedure
Step 1: Generate the Flutter application code
Prepare the environment.
Install Qoder.
Create a Flutter project.
In VS Code, use the keyboard shortcut Command + Shift + P (macOS) or Ctrl + Shift + P (Windows/Linux). Then, search for "flutter" and select Flutter: New Project.
Generate code using Qoder.
Describe the feature requirements in Qoder AI Chat to auto-generate Flutter code. The following is an example of a feature requirement description:
build a flutter image edit app, powered by supabase, using an edge function to invoke an image model to edit images uploaded by users

Debug the generated scaffold code. You can refer to the debugged sample package adb-supabase-flutter-demo.
Step 2: Configure AnalyticDB Supabase
Configure API access.
In the project's root directory, create a .env file. Copy the following template into the file and replace the placeholder values with your actual configuration. You can obtain the configuration information by following the instructions in Get API Keys.

SUPABASE_URL=https://sbp-xxxxx.supabase.opentrust.net
SUPABASE_SERVICE_KEY=xxxxxxxx
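If you wire these values into the Flutter app yourself, the following is a minimal sketch. The flutter_dotenv package and the placeholder root widget are illustrative assumptions, not part of the generated scaffold; supabase_flutter is the Supabase client library for Dart.

// main.dart: load .env and initialize the Supabase client at startup.
// Assumes flutter_dotenv and supabase_flutter are declared in pubspec.yaml
// and that .env is listed as an asset in the flutter section.
import 'package:flutter/material.dart';
import 'package:flutter_dotenv/flutter_dotenv.dart';
import 'package:supabase_flutter/supabase_flutter.dart';

Future<void> main() async {
  WidgetsFlutterBinding.ensureInitialized();
  await dotenv.load(fileName: '.env');
  await Supabase.initialize(
    url: dotenv.env['SUPABASE_URL']!,
    anonKey: dotenv.env['SUPABASE_SERVICE_KEY']!, // key name taken from the template above
  );
  runApp(const MaterialApp(home: Scaffold())); // placeholder root widget
}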
Design the database table schema.
Log on to the Supabase Dashboard and create a database table. This table stores records of user-edited images, including key information such as the original image URL, the edited image URL, and the user's prompt.
CREATE TABLE public.edited_images (
  id TEXT PRIMARY KEY,
  prompt TEXT NOT NULL,
  original_image_url TEXT NOT NULL,
  edited_image_url TEXT NOT NULL,
  created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
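After the table exists, the app can read back history records through the Dart client. A minimal sketch follows; the 20-row limit is an arbitrary choice for a history view.

import 'package:supabase_flutter/supabase_flutter.dart';

// Fetch the most recent edit records, newest first (sketch).
Future<List<Map<String, dynamic>>> fetchHistory() async {
  final rows = await Supabase.instance.client
      .from('edited_images')
      .select()
      .order('created_at', ascending: false)
      .limit(20);
  return List<Map<String, dynamic>>.from(rows);
}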
Create an object storage bucket.
In the sidebar of the Supabase Dashboard, click Storage.
Create a bucket named images to store user-uploaded image data.
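From the Flutter side, uploading to this bucket and producing a signed URL can be sketched as follows. The helper name and the path scheme are illustrative assumptions, not part of the generated scaffold.

import 'dart:io';
import 'package:supabase_flutter/supabase_flutter.dart';

// Upload a local image to the "images" bucket and return a signed URL
// that the edge function can later fetch.
Future<String> uploadAndSign(File imageFile) async {
  final storage = Supabase.instance.client.storage.from('images');
  final path = 'uploads/${DateTime.now().millisecondsSinceEpoch}.jpg';
  await storage.upload(path, imageFile); // throws a StorageException on failure
  return storage.createSignedUrl(path, 3600); // URL valid for one hour
}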
Step 3: Integrate the AI service
Configure secrets.
Note: In AnalyticDB Supabase, Alibaba Cloud provides native configuration and centralized management for Edge Function Secrets. This feature lets you securely store AI API tokens, such as those for DashScope and Model Studio, in the secret store of the function's runtime environment. You can then retrieve them using Deno.env.get, which prevents them from being hard-coded or exposed on the client.

In the sidebar of the Supabase Dashboard, click Edge Function > Secrets.
Configure BAILIAN_API_KEY using the Model Studio API key that you obtained.
Deploy the edge function.
In the sidebar of the Supabase Dashboard, click Edge Function > Functions.
In the upper-right corner of the page, click Deploy a new function and select Via Editor from the drop-down list.
Create and deploy a function named wan. The following code provides an example. Replace BASE_URL based on your network access method:
Private network: See Access Alibaba Cloud Model Studio over a private network through an endpoint.
Public network: See Image editing - Qwen.
// Read the Model Studio API key from the edge function secret store.
const DASHSCOPE_API_KEY = Deno.env.get('BAILIAN_API_KEY');
// Private-network DashScope endpoint; replace per your network access method.
const BASE_URL = 'https://vpc-cn-beijing.dashscope.aliyuncs.com/api/v1';

// Call the Qwen Image Edit model with the source image URL and the prompt.
async function callImageEditAPI(image_url, prompt) {
  const messages = [
    {
      role: "user",
      content: [
        { image: image_url },
        { text: prompt }
      ]
    }
  ];
  const payload = {
    model: "qwen-image-edit",
    input: { messages },
    parameters: { negative_prompt: "", watermark: false }
  };
  try {
    const response = await fetch(`${BASE_URL}/services/aigc/multimodal-generation/generation`, {
      method: 'POST',
      headers: {
        'Authorization': `Bearer ${DASHSCOPE_API_KEY}`,
        'Content-Type': 'application/json'
      },
      body: JSON.stringify(payload)
    });
    if (!response.ok) {
      console.error(`Request failed: ${response.status} ${response.statusText}`);
      return null;
    }
    const data = await response.json();
    // The edited-image URL is inside the first choice's message content.
    return data.output?.choices?.[0]?.message?.content ?? null;
  } catch (error) {
    console.error("Request error:", error.message);
    return null;
  }
}

// HTTP entry point: expects a JSON body {image_url, prompt}.
Deno.serve(async (req) => {
  try {
    const { image_url, prompt } = await req.json();
    if (!image_url || !prompt) {
      return new Response(JSON.stringify({ error: "Missing image_url or prompt" }), {
        status: 400,
        headers: { 'Content-Type': 'application/json' }
      });
    }
    const result = await callImageEditAPI(image_url, prompt);
    return new Response(JSON.stringify({ message: result }), {
      headers: { 'Content-Type': 'application/json', 'Connection': 'keep-alive' }
    });
  } catch (error) {
    console.error("Server error:", error);
    return new Response(JSON.stringify({ error: "Internal server error" }), {
      status: 500,
      headers: { 'Content-Type': 'application/json' }
    });
  }
});
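After deployment, you can smoke-test the function from the Flutter app. This is a minimal sketch: the placeholder image URL and prompt are illustrative, and the call assumes Supabase.initialize has already run.

import 'package:supabase_flutter/supabase_flutter.dart';

// Invoke the deployed "wan" function once to verify that it responds.
Future<void> smokeTestWan() async {
  final res = await Supabase.instance.client.functions.invoke(
    'wan',
    body: {
      'image_url': 'https://example.com/test.jpg', // placeholder: any reachable image
      'prompt': 'turn the person in the photo into a figurine',
    },
  );
  print(res.data); // expect {"message": [...]} on success, or an error JSON
}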
Workflow
Upload original image: After a user selects an image, the frontend uploads it to the images bucket in Supabase Storage and generates a signed URL.
Edit image via Qwen Image Edit model: The frontend sends the signed URL and the editing instruction (prompt) to the edge function. The edge function uses the BAILIAN_API_KEY to call the Qwen Image Edit model for image editing and retrieves the URL of the generated image.
Insert history record into edited_images table: The frontend writes information, such as the original image URL, the edited image URL, and the prompt, to the edited_images database table as a history record (see the sketch after this list).
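In Dart, the three workflow steps can be tied together in a single helper. This is a minimal sketch under the assumptions above; the helper name, the id scheme, and the response parsing are illustrative, not the generated app's exact code.

import 'dart:io';
import 'package:supabase_flutter/supabase_flutter.dart';

// End-to-end workflow sketch: upload, edit via the "wan" function, record.
Future<void> editAndRecord(File imageFile, String prompt) async {
  final client = Supabase.instance.client;

  // 1. Upload the original image and sign a URL for the edge function.
  final path = 'uploads/${DateTime.now().millisecondsSinceEpoch}.jpg';
  await client.storage.from('images').upload(path, imageFile);
  final signedUrl =
      await client.storage.from('images').createSignedUrl(path, 3600);

  // 2. Invoke the edge function, which calls Qwen Image Edit.
  final res = await client.functions.invoke(
    'wan',
    body: {'image_url': signedUrl, 'prompt': prompt},
  );
  // The function returns {"message": <model content>}; assuming the DashScope
  // shape [{"image": "<url>"}], pull out the edited-image URL.
  final content = res.data['message'] as List<dynamic>;
  final editedUrl = content.first['image'] as String;

  // 3. Write a history record to the edited_images table.
  await client.from('edited_images').insert({
    'id': path, // any unique TEXT id works; the storage path is convenient
    'prompt': prompt,
    'original_image_url': signedUrl,
    'edited_image_url': editedUrl,
  });
}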
Test and verify
Run the following commands to install dependencies and start the application.
flutter pub get
flutter run

After the application starts, you can test the AI figurine generation feature on your device or in an emulator.
Example prompt
Draw a 1/7 scale commercial figurine of the character in the image in a realistic style and a real environment. Place the figurine on a computer desk. The computer screen shows the C4D modeling process for this figurine. Next to the computer screen, place a plastic toy box with the original artwork printed on it. On the desk, also include tools for making figurines, such as brushes, paints, and small knives.

Test example
Result example

