The Blackhole connector works like /dev/null on Unix: it accepts all input records and discards them silently. Use it to isolate whether a problem in your Flink SQL job comes from the sink configuration or from upstream processing logic.
The Blackhole connector supports both batch and streaming modes, as well as data updates and deletions in the sink table.
Use cases
- Isolate sink configuration errors: If a sink table that uses another connector type throws an error and you are not sure whether the cause is a system issue or an invalid WITH clause parameter, change `connector` to `blackhole` and click Validate in the upper-right corner of the Drafts tab. If no error appears, the system is healthy; check your original WITH clause parameters.
- Measure performance overhead: Run a deployment in the RUNNING state with a Blackhole sink to measure the processing overhead of inserting data into a table, without the cost of writing to a real external system.
- Validate UDF output: Route user-defined function (UDF) output to a Blackhole sink table to inspect the data without writing to a physical table.
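As a sketch of the first use case, suppose a sink using another connector fails validation. Swapping only the `connector` value while keeping the schema identical isolates the WITH clause from the rest of the DDL. The Kafka connector and its option values below are illustrative, not part of this connector's documentation:

```sql
-- Original sink (hypothetical options): validation fails, cause unclear.
CREATE TEMPORARY TABLE kafka_sink (
  name VARCHAR,
  score BIGINT
) WITH (
  'connector' = 'kafka',
  'topic' = 'scores',                               -- illustrative value
  'properties.bootstrap.servers' = 'broker:9092'    -- illustrative value
);

-- Debugging copy: same schema, connector swapped to blackhole.
-- If this table validates, the problem lies in the Kafka options above,
-- not in the system or the schema.
CREATE TEMPORARY TABLE debug_sink (
  name VARCHAR,
  score BIGINT
) WITH (
  'connector' = 'blackhole'
);
```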
Capabilities
| Item | Description |
| --- | --- |
| Table type | Sink table |
| Running mode | Batch and streaming |
| Data format | N/A |
| Metric | N/A |
| API type | SQL API |
| Data update or deletion in a sink table | Supported |
Limitations
Realtime Compute for Apache Flink using Ververica Runtime (VVR) 2.0.0 or later is required.
Create a Blackhole sink table
Define the sink table with 'connector' = 'blackhole':
```sql
CREATE TABLE blackhole_sink(
  name VARCHAR,
  score BIGINT
) WITH (
  'connector' = 'blackhole'
);
```
To create a Blackhole sink table from an existing source table definition, use the LIKE clause:
```sql
CREATE TABLE blackhole_sink WITH ('connector' = 'blackhole')
LIKE table_source (EXCLUDING ALL);
```
Connector options
| Option | Required | Default | Type | Description |
| --- | --- | --- | --- | --- |
| connector | Yes | None | STRING | The connector type. Set to `blackhole`. |
Example
The following example reads from a source table and writes all records to a Blackhole sink:
```sql
CREATE TEMPORARY TABLE table_source(
  name VARCHAR,
  score BIGINT
) WITH (
  ...
);

CREATE TEMPORARY TABLE blackhole_sink(
  name VARCHAR,
  score BIGINT
) WITH (
  'connector' = 'blackhole'
);

INSERT INTO blackhole_sink SELECT * FROM table_source;
```
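The source WITH clause above is elided. For a self-contained sketch you can paste into a draft, one option is the `datagen` connector from open-source Flink, which generates random rows in memory; its availability in your VVR version is an assumption here:

```sql
-- Self-contained sketch: random rows in, silently discarded on the way out.
CREATE TEMPORARY TABLE table_source(
  name VARCHAR,
  score BIGINT
) WITH (
  'connector' = 'datagen',
  'rows-per-second' = '5'    -- throttle the generated stream
);

CREATE TEMPORARY TABLE blackhole_sink(
  name VARCHAR,
  score BIGINT
) WITH (
  'connector' = 'blackhole'
);

INSERT INTO blackhole_sink SELECT * FROM table_source;
```

Because the job has no external dependencies, any failure it reports must come from the SQL itself or the runtime, which makes it a convenient baseline for the performance and validation use cases above.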