
Create Kafka Data Source

NineData supports adding data sources of various types and environments to the console for unified management. Once a data source has been added, you can use the database DevOps, backup and recovery, data replication, and database comparison features with it. This article introduces how to add a Kafka data source to NineData.

Prerequisites

  • The server IP addresses of NineData have been added to the allowlist of the data source.

  • Make sure you have available data source quota; otherwise, the data source cannot be added. You can quickly check your remaining quota in the top-right corner of the NineData console.

Operation Steps

  1. Log in to the NineData Console.

  2. On the left navigation pane, click > .

  3. Click the  tab, then click  on the page. In the pop-up window for selecting the data source type, choose >  (the type of data source to add), and configure the parameters on the page that opens, based on the table below.
    Tip

    If you make a mistake during the operation, you can click the down-arrow icon at the top of the page to make a new selection.

  4. Configure the parameters of the data source:

    • Data Source Name: enter a name for the data source. To simplify later search and management, use a meaningful name.
    • Connection Method: select how to access the data source. Three methods are supported: public network, gateway, and SSH Tunnel.
      • Public network: access the data source through a public network address.
      • Gateway: a secure and fast intranet access method provided by NineData. You first need to connect the host where the data source is located; for the connection method, see Add Gateway.
      • SSH Tunnel: access the data source through an SSH tunnel.
    • Public network configuration items
      • Broker List: a Kafka cluster is composed of multiple brokers, and each broker is an independent Kafka server instance. Enter the public connection address and port of the Kafka data source. If there are multiple brokers, click the add button to enter the remaining ones.
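In standard Kafka client terms, the broker list corresponds to the bootstrap servers setting: a comma-separated list of host:port pairs, any one of which lets the client discover the rest of the cluster. A minimal illustrative fragment (the hostnames below are placeholders, not NineData defaults):

```properties
# Illustrative Kafka client bootstrap list; each entry is one broker's
# address and port, and any reachable entry is enough for discovery
bootstrap.servers=broker1.example.com:9092,broker2.example.com:9092,broker3.example.com:9092
```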
    • Gateway configuration items
      • Gateway: select the NineData gateway installed on the host where the data source is located.
      • Broker List: enter the intranet connection address and port of the Kafka data source. The broker can be written as localhost (when the data source is on the same machine) or as the intranet IP of the host where the data source is located.
    • SSH Tunnel configuration items
      • Host and port: enter the public IP or domain name of the server where the target data source is located, and the corresponding port number (the default port for the SSH service is 22).
      • Authentication method: select the SSH authentication method.
        • Password authentication: connect with a username (the server's login name) and a password (the server's login password).
          • Username: enter the login username of the server where the target data source is located.
          • Password: enter the login password of that server.
        • Key authentication (recommended): connect with a username and a private key file.
          • Username: enter the login username of the server where the target data source is located.
          • Key file: click to upload the private key file, which is a key file without a suffix. If you have not created one yet, see Generate SSH Tunnel Key File.
          • Key passphrase: enter the password set when the key file was generated. If you did not set a password when generating the key, leave this field blank.
      • Note: after the SSH configuration is complete, click the test button on the right. There are two possible results:
        • A success message indicates that the SSH tunnel has been established.
        • An error message indicates a connection failure; troubleshoot according to the error message and retry.
      • Broker List: enter the intranet connection address and port of the Kafka data source. The broker can be written as localhost (when the data source is on the same machine) or as the intranet IP of the host where the data source is located.
    • Authentication Method: select the Kafka authentication method.
      • PLAINTEXT: no authentication information is required and data is transmitted in plaintext. This method is usually suitable only for development and test environments.
      • SASL/PLAIN (default): a SASL-based authentication method that requires a username and password. Data is transmitted in plaintext, so this method is recommended only on networks with encrypted transport.
      • SASL/SCRAM-SHA-256: a SASL-based authentication method that requires a username and password and hashes the password with the SHA-256 algorithm, providing higher security.
      • SASL/SCRAM-SHA-512: a SASL-based authentication method that requires a username and password and hashes the password with the SHA-512 algorithm, providing higher security.
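For reference, the SASL choices above map onto standard Kafka client properties roughly as follows. This is an illustrative fragment with placeholder credentials, not NineData-specific configuration; for SASL/PLAIN the login module would be PlainLoginModule instead, and SASL_SSL replaces SASL_PLAINTEXT when the connection is also TLS-encrypted:

```properties
# Illustrative client settings for SASL/SCRAM-SHA-256
# (username and password are placeholders)
security.protocol=SASL_PLAINTEXT
sasl.mechanism=SCRAM-SHA-256
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
    username="alice" \
    password="alice-secret";
```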
    • Username: the Kafka username.
    • Password: the Kafka password.
    • Region: select the region closest to where your data source is located to effectively reduce network latency.
    • Environment: choose according to the actual business purpose of the data source; it serves as an environment tag for the data source. Two environments are provided by default, and you can also create custom environments.
      Note: In organization mode, the database environment can also be used in permission policy management; for example, the default role may only be allowed to access data sources in a specific environment. For more information, see Manage Roles.
    • SSL encryption: whether to use SSL encryption when accessing the data source (off by default). If the data source enforces SSL encrypted connections, this switch must be turned on; otherwise the connection fails. Click the expand arrow on the left to show the detailed configuration:
      • Trust Store: the trusted certificate issued by the CA, i.e., the server-side certificate. This certificate is used to verify the identity of the server and ensure that the Kafka service you are connecting to is trusted.
      • Trust Store Password: the password that protects the Trust Store certificate.
      • Key Store: the certificate used to verify the user's identity, i.e., the client-side certificate. This certificate ensures that the user connecting to Kafka is trusted.
      • Key Store Password: the password that protects the Key Store certificate.
      For SSL configuration methods, see the official documentation: Configure Kafka Encrypted Connections.
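As a rough illustration, the Trust Store and Key Store entries above correspond to the following standard Kafka client SSL properties (the paths and passwords are placeholders):

```properties
# Illustrative Kafka client SSL settings; combine with SASL_SSL when
# SASL authentication is also enabled
security.protocol=SSL
ssl.truststore.location=/var/private/ssl/client.truststore.jks
ssl.truststore.password=changeit
ssl.keystore.location=/var/private/ssl/client.keystore.jks
ssl.keystore.password=changeit
```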
  5. After all configurations are complete, run the connection test to verify that the data source can be accessed. If the test succeeds, you can complete the addition of the data source; otherwise, recheck the connection settings until the connection test succeeds.

Appendix: Add NineData's IP Address to Kafka Database Whitelist

When adding a self-managed data source, you need to add NineData's service IP addresses to the database whitelist so that NineData can provide services.

This section takes Kafka version 3.3.2 as an example to introduce how to add an IP whitelist.

  1. Open Kafka's configuration file server.properties, which is usually located at: <Kafka installation directory>/config/server.properties.

  2. Find the listeners parameter and set it to the IP address and port on which Kafka accepts connections, making sure this address is reachable from NineData. For example, to allow NineData to access Kafka, you can set:

    listeners=PLAINTEXT://121.199.39.25:9092
  3. Save the changes and restart the Kafka service.
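Note that listeners controls the address and port Kafka binds to rather than a client IP allowlist; IP-level filtering is normally handled by the host firewall or a security group. When the broker must accept connections on all interfaces but advertise a public address to remote clients such as NineData, a common pattern is the following (the public IP is a placeholder):

```properties
# Bind on all interfaces, but advertise the public address that
# remote clients will use to reach this broker
listeners=PLAINTEXT://0.0.0.0:9092
advertised.listeners=PLAINTEXT://203.0.113.10:9092
```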