# Configure the Kafka sink (inbound) connector

## Overview

This page describes how to configure streaming from Kafka to an Aerospike database.

The Aerospike Kafka sink (inbound) connector reads data from Apache Kafka and writes data to an Aerospike database.

## Configure streaming

To configure streaming from Kafka to Aerospike, set the Kafka sink connector to transform Kafka records into Aerospike records. Store the configuration as `aerospike-kafka-inbound.yml` or `aerospike-kafka-inbound.json` in the `/etc/` directory in your Kafka installation on each Kafka connect node. You can also pass the configuration as a JSON-formatted object. See [Standalone mode](https://aerospike.com/docs/connectors/streaming/kafka/inbound/deploy-kafka-inbound-connector#standalone-mode) for more information.
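If you pass the configuration as a JSON-formatted object instead of a YAML file, the structure translates one-to-one. Here is an illustrative sketch of a minimal JSON equivalent (values are taken from the YAML example on this page and trimmed for brevity, not a complete configuration):

```json
{
  "max-queued-records": 10000,
  "aerospike": {
    "seeds": [
      { "192.168.50.1": { "port": 3000, "tls-name": "red" } },
      "192.168.50.2"
    ],
    "cluster-name": "east"
  },
  "topics": {
    "users": {
      "invalid-record": "ignore",
      "mapping": {
        "namespace": { "mode": "static", "value": "users" },
        "key-field": { "source": "key" }
      }
    }
  }
}
```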

### Version 2.2.1 and later

The configuration has the following options:

| Option | Required | Default | Description |
| --- | --- | --- | --- |
| `max-queued-records` | no | 32768 | Maximum number of records queued in the connector. The queue can briefly exceed this size before the connector pauses consumption from topics. All topics resume after the queue size drops below half of the maximum. |
| `processing-threads` | no | Available processors | Number of threads to use for processing Kafka records and converting them to Aerospike records. |
| [`aerospike`](https://aerospike.com/docs/connectors/streaming/kafka/inbound/configure/aerospike) | yes |  | Configures the connection properties that the connector must use when connecting to your Aerospike database. |
| [`topics`](https://aerospike.com/docs/connectors/streaming/kafka/inbound/configure/topics) | yes |  | Configures the Kafka topics the connector listens to and the transformations to Aerospike records. |

Here is an example:

```yaml
max-queued-records: 10000

aerospike:
  seeds:
    - 192.168.50.1:
        port: 3000
        tls-name: red
    - 192.168.50.2
  cluster-name: east

topics:
  users:
    invalid-record: ignore
    mapping:
      namespace:
        mode: static
        value: users
      set:
        mode: dynamic
        source: value-field
        field-name: city
      key-field:
        source: key
      ttl:
        mode: dynamic
        source: value-field
        field-name: ttl
      bins:
        type: multi-bins
        map:
          name:
            source: value-field
            field-name: firstName
```
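To make the example mapping concrete, here is a small illustrative Python sketch of how a Kafka record would be transformed under this configuration. This is not the connector's code; the `transform` helper is hypothetical, and the field names mirror the example above:

```python
import json

def transform(kafka_key, kafka_value):
    """Sketch of the example mapping: Kafka record -> Aerospike record fields."""
    value = json.loads(kafka_value)
    return {
        "namespace": "users",                  # namespace: mode static, value "users"
        "set": value["city"],                  # set: dynamic, from value field "city"
        "key": kafka_key,                      # key-field: taken from the Kafka record key
        "ttl": value["ttl"],                   # ttl: dynamic, from value field "ttl"
        "bins": {"name": value["firstName"]},  # bins: multi-bins, "name" from "firstName"
    }

record = transform("user-42", '{"city": "Boston", "ttl": 3600, "firstName": "Ann"}')
print(record["set"])   # Boston
print(record["bins"])  # {'name': 'Ann'}
```

A record whose value lacks one of the mapped fields would be handled according to the `invalid-record` setting (`ignore` in the example).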

### Version 2.2.0 and earlier

The configuration has the following options:

| Option | Required | Default | Description |
| --- | --- | --- | --- |
| [`feature-key-file`](https://aerospike.com/docs/connectors/streaming/kafka/inbound/configure/feature-key-file) | yes for versions before 2.2.0; no for version 2.2.0 and later |  | The location of the feature key file. Not required in versions 2.2.0 and later. |
| [`topics`](https://aerospike.com/docs/connectors/streaming/kafka/inbound/configure/topics) | yes |  | Configures the Kafka topics the connector listens to and the transformations to Aerospike records. |
| [`aerospike`](https://aerospike.com/docs/connectors/streaming/kafka/inbound/configure/aerospike) | yes |  | Configures the connection properties that the connector must use when connecting to your Aerospike database. |

::: caution
`feature-key-file` is deprecated in version 2.2.0 and later. Verify that the `mesg-kafka-connector` feature key is set to `true` in the feature file, and that the feature file is loaded on the Aerospike server. The feature key is read directly from the Aerospike server.
:::

Here is an example:

```yaml
aerospike:
  seeds:
    - 192.168.50.1:
        port: 3000
        tls-name: red
    - 192.168.50.2
  cluster-name: east

topics:
  users:
    invalid-record: ignore
    mapping:
      namespace:
        mode: static
        value: users
      set:
        mode: dynamic
        source: value-field
        field-name: city
      key-field:
        source: key
      ttl:
        mode: dynamic
        source: value-field
        field-name: ttl
      bins:
        type: multi-bins
        map:
          name:
            source: value-field
            field-name: firstName
```