This example combines a Kafka topic, a PostgreSQL database, the CSV format, and a REST endpoint to simulate account transactions that can then be queried over REST.
- A running Kafka cluster
- A running PostgreSQL database
- A database named `transactions`
- A table named `transactions` with the following structure:

  | Column name | Data type |
  |-------------|-----------|
  | id          | serial    |
  | timestamp   | varchar   |
  | account     | varchar   |
  | amount      | float     |
```sql
CREATE TABLE transactions (
    "timestamp" character varying,
    account character varying,
    amount bigint,
    id integer NOT NULL
);

CREATE SEQUENCE transactions_id_seq
    AS integer
    START WITH 1
    INCREMENT BY 1
    NO MINVALUE
    NO MAXVALUE
    CACHE 1;

ALTER TABLE ONLY transactions ALTER COLUMN id SET DEFAULT nextval('public.transactions_id_seq'::regclass);
```
The load is generated by the `load-generator.camel.yaml` file, which creates CSV-like records and submits them to a Kafka topic. The `data-ingestion.camel.yaml` file then consumes messages from that topic and stores them in the PostgreSQL database.
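The records on the topic are CSV-like lines whose fields mirror the `transactions` table. A minimal sketch of how such a record could be built (the account names and amount range are hypothetical illustrations, not the actual load-generator logic):

```python
import csv
import io
from datetime import datetime, timezone
from random import choice, uniform

def make_record() -> str:
    """Build one CSV-like transaction record: timestamp,account,amount."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow([
        datetime.now(timezone.utc).isoformat(),
        choice(["acme", "globex", "initech"]),  # hypothetical account names
        round(uniform(-500, 500), 2),           # hypothetical amount range
    ])
    return buf.getvalue().strip()

print(make_record())  # prints one line in the form: timestamp,account,amount
```

Each such line is what the ingestion route later unmarshals with the CSV data format before inserting it into the database.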
- Open the `data-ingestion.camel.yaml` file and set the `PostgresqlDataSource` bean properties to match your PostgreSQL database, such as `databaseName`, `user`, `password`, `serverName`, and `portNumber`.

  This example expects a secret named `postgres` with the following keys:
  - `database-name` containing the database name
  - `database-user` containing the user name
  - `database-password` containing the password
- Since this example requires the PostgreSQL dependency, change to the `account-transactions` folder:

  ```
  cd account-transactions
  ```
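For orientation, a sketch of what the `PostgresqlDataSource` bean configuration inside `data-ingestion.camel.yaml` might look like, using the Camel YAML DSL `beans` section. The class name and placeholder values are assumptions; consult the actual file for the exact shape:

```yaml
# Hypothetical sketch; property names match the step above, values are placeholders.
- beans:
    - name: PostgresqlDataSource
      type: org.postgresql.ds.PGSimpleDataSource
      properties:
        databaseName: transactions
        user: myuser
        password: mypassword
        serverName: localhost
        portNumber: 5432
```

When running with the secret described above, these values would instead be resolved from the `postgres` secret's keys.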
- Run the integration using the Camel CLI extension, or by executing the following command:

  ```
  jbang '-Dcamel.jbang.version=4.5.0' camel@apache/camel run * --dev --logging-level=info
  ```
- In another terminal, execute the following command to retrieve the stored records:

  ```
  curl localhost:8080
  ```
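The endpoint returns the transactions stored in the database. Assuming the response is a JSON array of rows (a hypothetical shape; the actual format depends on how the route marshals its result), it could be consumed like this:

```python
import json

# Hypothetical response body; the real field names follow the transactions table.
body = '[{"id": 1, "timestamp": "2024-01-01T00:00:00Z", "account": "acme", "amount": -12.34}]'

transactions = json.loads(body)
for tx in transactions:
    print(f'{tx["account"]}: {tx["amount"]}')  # prints "acme: -12.34"
```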