This repository demonstrates a Change Data Capture (CDC) scenario using a MySQL database, Kafka, and the Transfer CLI (`trcli`), which replicates changes from MySQL to Kafka topics. The setup uses Docker Compose to manage multiple services, making it easy to run the entire application stack locally.
Here's the corresponding Mermaid diagram:

```mermaid
graph LR
    subgraph Source
        A[MySQL DB]
    end
    subgraph TRCLI
        B[Replication from MySQL]
    end
    subgraph Messaging
        C[Kafka]
    end
    subgraph User_Interface
        D[Kafka UI]
    end

    A -- CRUD Operations --> B
    B -- Replicates Data --> C
    C -- Exposes Kafka Topics --> D

    classDef source fill:#dff,stroke:#000,stroke-width:2px,rx:5px,ry:5px;
    classDef replication fill:#aaf,stroke:#000,stroke-width:2px,rx:5px,ry:5px;
    classDef messaging fill:#ffefaa,stroke:#000,stroke-width:2px,rx:5px,ry:5px;
    classDef ui fill:#afa,stroke:#000,stroke-width:2px,rx:5px,ry:5px;

    class A source
    class B replication
    class C messaging
    class D ui
```
- MySQL: A MySQL 8.0 instance is used as the source of data changes.
  - Database: `testdb`
  - User: `testuser`
  - Password: `testpassword`
  - Initialization: Data is seeded using `init.sql`.
- Kafka: A Kafka broker (using KRaft mode) that receives messages from the Transfer CLI and exposes them as topics.
  - Listening ports:
    - 9092 (client connections)
    - 9093 (internal controller communication)
- Transfer CLI: A Go-based application that replicates changes from MySQL to Kafka.
  - Configuration: Reads changes from MySQL and sends them to Kafka topics prefixed with `cdc`.
  - Parallel writers: Uses 10 parallel writers to send data to Kafka.
- Kafka UI: A web-based interface to monitor and manage Kafka topics.
  - Access URL: http://localhost:8080
- Load Generator: A CRUD load generator that performs operations on the MySQL database, which triggers CDC (see the Go sketch after this list for driving the same kind of load yourself).
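The load generator simply issues SQL statements against `testdb`. If you want to drive the same kind of load from your own code, here is a minimal Go sketch. It assumes MySQL is published on the host at `localhost:3306` and that `init.sql` creates a `users(id, name, email)` table; adjust the DSN and statements to match the actual compose file and seed script.

```go
package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/go-sql-driver/mysql" // MySQL driver, registered for database/sql
)

func main() {
	// Credentials match the MySQL service above; the host port (3306) and the
	// users table are assumptions for illustration.
	db, err := sql.Open("mysql", "testuser:testpassword@tcp(localhost:3306)/testdb")
	if err != nil {
		log.Fatalf("open connection: %v", err)
	}
	defer db.Close()

	// Insert a row; the Transfer CLI should pick up this change and publish it
	// to a cdc-prefixed Kafka topic.
	res, err := db.Exec("INSERT INTO users (name, email) VALUES (?, ?)", "alice", "alice@example.com")
	if err != nil {
		log.Fatalf("insert: %v", err)
	}
	id, _ := res.LastInsertId()

	// Update and delete the same row to generate further change events.
	if _, err := db.Exec("UPDATE users SET email = ? WHERE id = ?", "alice@new.example.com", id); err != nil {
		log.Fatalf("update: %v", err)
	}
	if _, err := db.Exec("DELETE FROM users WHERE id = ?", id); err != nil {
		log.Fatalf("delete: %v", err)
	}
	fmt.Printf("generated insert/update/delete events for row %d\n", id)
}
```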
You will need Docker and Docker Compose installed on your machine.
- Clone the repository:

  ```bash
  git clone https://github.com/doublecloud/transfer
  cd transfer/examples/mysql2kafka
  ```

- Build and run the stack with Docker Compose:

  ```bash
  docker-compose up --build
  ```

- Access the Kafka UI: open your web browser and navigate to http://localhost:8080 to view and manage Kafka topics.
- Once the Docker containers are running, you can start performing CRUD operations on the MySQL database. The `crud_load_gen` service will simulate these operations.
- The `transfer` CLI will listen for changes in the MySQL database and replicate them to Kafka.
- You can monitor the changes in Kafka using the Kafka UI, or consume the `cdc`-prefixed topics directly, as sketched below.
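Besides the Kafka UI, you can inspect the replicated events programmatically. The sketch below uses the third-party `github.com/segmentio/kafka-go` client (any Kafka client works) against the broker's exposed client port 9092; the topic name `cdc.testdb.users` is hypothetical, since the exact `cdc`-prefixed topic depends on `transfer.yaml` and the replicated table.

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/segmentio/kafka-go"
)

func main() {
	// Broker address uses the client port (9092) exposed by the compose setup.
	// The topic name is hypothetical: the Transfer CLI prefixes topics with
	// "cdc", so check the Kafka UI for the exact name produced by your config.
	r := kafka.NewReader(kafka.ReaderConfig{
		Brokers: []string{"localhost:9092"},
		Topic:   "cdc.testdb.users", // adjust to the real topic name
		GroupID: "cdc-inspector",    // arbitrary consumer group for inspection
	})
	defer r.Close()

	// Print each replicated change event as it arrives.
	for {
		msg, err := r.ReadMessage(context.Background())
		if err != nil {
			log.Fatalf("read message: %v", err)
		}
		fmt.Printf("offset %d: %s\n", msg.Offset, msg.Value)
	}
}
```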
Here are some example SQL statements you might use to perform CRUD operations on the MySQL database:

- Insert:

  ```sql
  INSERT INTO your_table_name (column1, column2) VALUES ('value1', 'value2');
  ```

- Update:

  ```sql
  UPDATE your_table_name SET column1 = 'new_value' WHERE column2 = 'value2';
  ```

- Delete:

  ```sql
  DELETE FROM your_table_name WHERE column1 = 'value1';
  ```
- Transfer Configuration: The Transfer CLI is configured through `transfer.yaml`, which specifies the source (MySQL) and destination (Kafka) settings.
To stop the Docker containers, run:
```bash
docker-compose down
```
This example provides a complete end-to-end CDC solution using MySQL, Kafka, and the Transfer CLI. You can use it to demonstrate how data can be replicated from a relational database to a streaming platform like Kafka for real-time processing.