
Single instance, multiple concurrent requests

Last Updated: Mar 06, 2020

The charges for Function Compute V1.0 are calculated based on the total time consumed to execute each request. This billing method is not cost-effective for functions that spend most of their time waiting on input/output operations, such as accessing a database.

exports.handler = (event, context, callback) => {
    // time = 00:00:00: the request starts and waits for the database
    db.getData(key, (result) => {
        // time = 00:00:10: the database responds after 10 seconds
        callback(null, result);
    });
};

If this function processes three requests and each database query takes 10 seconds, the total billed execution duration is 30 seconds.

Function Compute V2.0 offers an updated billing method: charges are calculated based on the execution duration of instances. For the preceding scenario, this method is more cost-effective because the three requests are processed concurrently in one instance, which consumes only 10 seconds. The feature of concurrently processing requests in one instance is designed for such scenarios. You control it with the InstanceConcurrency parameter, which limits the maximum number of requests that an instance can process at the same time. The following figure shows the differences between a function whose instances process one request at a time and a function whose instances process multiple requests concurrently.
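The following local sketch (not the Function Compute runtime; the 10-second getData function is a stand-in for a real database call) illustrates why concurrent processing shortens the billed duration: three I/O-bound calls that each wait 10 seconds finish in roughly 10 seconds of wall-clock time when they overlap.

// Simulated database call with a fixed 10-second latency.
function getData(key) {
    return new Promise((resolve) => {
        setTimeout(() => resolve(`value-of-${key}`), 10000);
    });
}

async function main() {
    const start = Date.now();
    // The three calls run concurrently, so their waits overlap.
    const results = await Promise.all([getData('a'), getData('b'), getData('c')]);
    const elapsed = Math.round((Date.now() - start) / 1000);
    console.log(results, `elapsed: about ${elapsed} seconds`); // about 10, not 30
}

main();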

The feature of concurrently processing requests in one instance has the following benefits:

  1. Saves costs on execution duration: Functions that perform input/output operations can process multiple requests in one instance, which minimizes the total execution duration of instances.
  2. Allows requests to share resources: Multiple requests in one instance can share resources such as a database connection pool, which minimizes the number of connections between the instance and the database (see the sketch after this list).
  3. Reduces the frequency of cold starts: Because one instance can now process multiple requests, fewer new instances need to be created, which reduces the frequency of cold starts.
  4. Reduces the number of IP addresses used in a VPC: For a fixed number of requests, each instance handles multiple requests, so fewer instances are occupied, which in turn reduces the number of IP addresses used in the VPC.
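As a sketch of the shared-resource benefit, the following handler creates a database connection pool once at module scope, so all requests handled by the same instance, including concurrent ones, reuse it. The mysql2 package, the environment variable names, and the pool size are assumptions for illustration and are not part of Function Compute.

// Hypothetical example: the pool is created once per instance, outside the handler,
// so concurrent requests in the same instance share it instead of opening new connections.
const mysql = require('mysql2/promise');

const pool = mysql.createPool({
    host: process.env.DB_HOST,          // assumed environment variables
    user: process.env.DB_USER,
    password: process.env.DB_PASSWORD,
    database: process.env.DB_NAME,
    connectionLimit: 10,                // assumed pool size
});

exports.handler = (event, context, callback) => {
    // Each concurrent request borrows a connection from the shared pool.
    pool.query('SELECT 1 AS ok')
        .then(([rows]) => callback(null, rows[0]))
        .catch((err) => callback(err));
};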

By default, the value of InstanceConcurrency is 1, which means that an instance can process only one request at a time. After you set InstanceConcurrency to a value greater than 1, Function Compute makes full use of existing instances before it creates new instances. Note that this feature is not suitable for all functions. The following table lists typical scenarios and whether the feature applies to them.

| Scenario | Applicable | Reason |
| --- | --- | --- |
| Requests wait for responses from a downstream service for an extended period of time | Yes | Waiting requests generally consume few resources. Processing them concurrently in a single instance saves costs. |
| Requests use shared state that cannot be accessed concurrently | No | Concurrently modifying shared state, such as context variables, may cause errors. |
| A single request consumes a large amount of CPU and memory resources | No | Multiple requests compete for resources, which can lead to insufficient memory or longer latency. |

Specify the instance concurrency

You can specify the InstanceConcurrency parameter when you create or update a function. You can also view the concurrency on the information page of the function.

If you enable the reserved instance feature, both reserved and pay-as-you-go instances can concurrently process requests.

Specify the concurrency through the console

You can specify the instance concurrency on the creation or configuration page of a function, as shown in the following figure.

Configure the instance concurrency

Use the SDK or API to specify instance concurrency

The following code uses the Node.js SDK as an example to specify the instance concurrency of a function:

// client is an initialized Function Compute client (for example, from the
// @alicloud/fc2 SDK), and fs is the Node.js fs module.

// Create a function with an instance concurrency of 10.
var resp = await client.createFunction(serviceName, {
    functionName: funcName,
    handler: 'counter.handler',
    memorySize: 512,
    runtime: 'nodejs10',
    code: {
        zipFile: fs.readFileSync('/tmp/counter.zip', 'base64'),
    },
    instanceConcurrency: 10,
});

// Update the instance concurrency of an existing function.
var resp = await client.updateFunction(serviceName, funcName, {
    instanceConcurrency: 20,
});

// Query the function; the returned configuration includes instanceConcurrency.
var resp = await client.getFunction(serviceName, funcName);

Differences after the instance concurrency is specified

After you change InstanceConcurrency from 1 to a value greater than 1, the following aspects of the function's behavior change:

Pricing

When an instance concurrently processes multiple requests, Function Compute calculates charges based on the execution duration of the instance. This duration starts when the first request starts and ends when the last request completes.
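For example (hypothetical timings), if request A runs from 00:00:00 to 00:00:07 and request B runs in the same instance from 00:00:03 to 00:00:10, the billed execution duration of the instance is 10 seconds (00:00:00 to 00:00:10), not the 17 seconds that the two requests consume in total.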

For more information, see the “Execution duration cost” section in Billing methods.

Concurrent request limit

By default, Function Compute allows a maximum concurrency of 100. The ResourceExhausted error is returned when the number of requests exceeds this limit. When you enable concurrent request processing in one instance, the maximum concurrency limits the number of instances rather than the number of requests. For example, if InstanceConcurrency is set to 10, a maximum of 1,000 concurrent requests are allowed.
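In general, the maximum number of concurrent requests equals the instance limit multiplied by InstanceConcurrency (100 x 10 = 1,000 in this example).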

Log

  1. When an instance processes one request at a time, if you specify X-Fc-Log-Type: Tail in the HTTP request header, Function Compute returns the function logs in the X-Fc-Log-Result field of the response header. When an instance processes multiple requests concurrently, the response header does not include function logs, because the logs of a specific request cannot be separated from those of the other concurrent requests.
  2. In the Node.js runtime, log entries written with console.info() are prefixed with the ID of the current request. When an instance processes multiple requests concurrently, console.info() cannot attach the correct request ID to every entry. Use context.logger.info() instead to write logs with the correct request ID, as shown in the following example.
exports.handler = (event, context, callback) => {
    console.info('logger begin');
    context.logger.info('ctxlogger begin');
    setTimeout(function () {
        context.logger.info('ctxlogger end');
        console.info('logger end');
        callback(null, 'hello world');
    }, 3000);
};

The returned logs are as follows. Note that both "logger end" entries written by console.info() are attributed to req2, whereas context.logger.info() records the correct request ID for each entry.

2019-11-06T14:23:37.587Z req1 [info] logger begin
2019-11-06T14:23:37.587Z req1 [info] ctxlogger begin
2019-11-06T14:23:37.587Z req2 [info] logger begin
2019-11-06T14:23:37.587Z req2 [info] ctxlogger begin
2019-11-06T14:23:40.587Z req1 [info] ctxlogger end
2019-11-06T14:23:40.587Z req2 [info] ctxlogger end
2019-11-06T14:23:40.587Z req2 [info] logger end
2019-11-06T14:23:40.587Z req2 [info] logger end

Error handling

When an instance processes multiple requests, a process crash or unexpected exit caused by one failed request also affects the other requests running concurrently in the same instance. Therefore, write error-handling logic in your functions so that a single failed request does not affect the others. The following Node.js code provides an example of handling exceptions.

exports.handler = (event, context, callback) => {
    try {
        // event is expected to be a JSON string; invalid input throws here.
        JSON.parse(event);
    } catch (ex) {
        // Report the error for this request only and stop processing it.
        callback(ex);
        return;
    }
    callback(null, 'hello world');
};
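Returning immediately after callback(ex) keeps the failure path from falling through to the success callback, and catching the exception prevents the process from crashing, so other requests running in the same instance continue normally.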

Shared variables

When an instance processes multiple requests concurrently, errors may occur if those requests modify the same variable at the same time. When you write your functions, use mutual exclusion (for example, locks) to protect shared variables from modifications that are not thread-safe. The following Java code provides an example of mutual exclusion.

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

import com.aliyun.fc.runtime.Context;
import com.aliyun.fc.runtime.StreamRequestHandler;

public class App implements StreamRequestHandler {
    private static int counter = 0;

    @Override
    public void handleRequest(InputStream inputStream, OutputStream outputStream, Context context) throws IOException {
        // Concurrent requests may run on different threads in the same instance,
        // so the increment of the shared counter is protected by a lock.
        synchronized (this) {
            counter = counter + 1;
        }
        outputStream.write(new String("hello world").getBytes());
    }
}

Monitoring metrics

After you specify the instance concurrency for your function, you can see in the function's monitoring charts that the number of instances in use decreases.

| Item | Value |
| --- | --- |
| Supported runtimes | Node.js 6, Node.js 8, Node.js 10, Java 8, and custom runtimes |
| Value range of InstanceConcurrency | Integer from 1 to 100 |
| Returning function logs in the X-Fc-Log-Result field of the response header | Not supported when InstanceConcurrency is set to a value greater than 1 |