Caused by: org.apache.kafka.connect.errors.DataException: JsonConverter with schemas.enable requires "schema" and "payload" fields and may not contain additional fields. If you are trying to deserialize plain JSON data, set schemas.enable=false in your converter configuration.
Hi matriv,
As per your suggestion, I have changed the driver, but it now gives the exception below:
[2024-08-02 05:51:26,785] ERROR [cratedb-connector|task-3] WorkerSinkTask{id=cratedb-connector-3} Task threw an uncaught and unrecoverable exception. Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:196)
org.apache.kafka.connect.errors.ConnectException: Exiting WorkerSinkTask due to unrecoverable exception.
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:611)
at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:333)
at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:234)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:203)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:189)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:244)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
at java.base/java.util.concurrent.FutureTask.run(Unknown Source)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.base/java.lang.Thread.run(Unknown Source)
Caused by: org.apache.kafka.connect.errors.ConnectException: java.sql.SQLException: No suitable driver found for jdbc:postgres://localhost:5432/gra?user=crate
at io.confluent.connect.jdbc.util.CachedConnectionProvider.getConnection(CachedConnectionProvider.java:62)
at io.confluent.connect.jdbc.sink.JdbcDbWriter.write(JdbcDbWriter.java:64)
at io.confluent.connect.jdbc.sink.JdbcSinkTask.put(JdbcSinkTask.java:90)
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:581)
... 10 more
Caused by: java.sql.SQLException: No suitable driver found for jdbc:postgres://localhost:5432/gra?user=crate
at java.sql/java.sql.DriverManager.getConnection(Unknown Source)
at java.sql/java.sql.DriverManager.getConnection(Unknown Source)
at io.confluent.connect.jdbc.dialect.GenericDatabaseDialect.getConnection(GenericDatabaseDialect.java:256)
at io.confluent.connect.jdbc.util.CachedConnectionProvider.newConnection(CachedConnectionProvider.java:84)
at io.confluent.connect.jdbc.util.CachedConnectionProvider.getConnection(CachedConnectionProvider.java:54)
... 13 more
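For reference, the pgjdbc driver only registers itself for URLs that begin with jdbc:postgresql://, so a URL of the form jdbc:postgres:// will not match any loaded driver even when the jar is on the classpath. A minimal sketch of the relevant sink property, reusing the host, port, and database from the log above:

```properties
# pgjdbc matches jdbc:postgresql://, not jdbc:postgres://
connection.url=jdbc:postgresql://localhost:5432/gra?user=crate
```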
After adding postgresql-42.7.3.jar, it throws the exception below:
Caused by: org.apache.kafka.connect.errors.ConnectException: Sink connector 'cratedb-connector' is configured with 'delete.enabled=false' and 'pk.mode=kafka' and therefore requires records with a non-null Struct value and non-null Struct schema, but found record at (topic='Mytopic',partition=0,offset=20,timestamp=1722577031564) with a HashMap value and null value schema.
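This error comes from the JDBC sink itself: to map record fields to table columns it needs a Connect Struct value with an attached schema, whereas a plain JSON object deserialized with schemas.enable=false arrives as a schemaless HashMap. One way to produce a Struct is the schema/payload envelope that the JsonConverter understands when schemas.enable=true. A minimal sketch, assuming a hypothetical two-field record with an integer id and a text name:

```json
{
  "schema": {
    "type": "struct",
    "fields": [
      {"field": "id", "type": "int32", "optional": false},
      {"field": "name", "type": "string", "optional": true}
    ],
    "optional": false
  },
  "payload": {"id": 1, "name": "example"}
}
```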
Hi @Baur, I already tried this, but the error below occurred:
Error encountered in task cratedb-connector-0. Executing stage 'VALUE_CONVERTER' with class 'org.apache.kafka.connect.json.JsonConverter', where consumed record is {topic='mytopic', partition=0, offset=5, timestamp=1722920943610, timestampType=CreateTime}. (org.apache.kafka.connect.runtime.errors.LogReporter:66)
org.apache.kafka.connect.errors.DataException: JsonConverter with schemas.enable requires "schema" and "payload" fields and may not contain additional fields. If you are trying to deserialize plain JSON data, set schemas.enable=false in your converter configuration.
Due to the nature of our data, we cannot specify the schema.
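Tying the two errors together: setting schemas.enable=false, as the converter message suggests, silences the DataException, but it hands the sink a schemaless map, which is exactly the HashMap/null-schema rejection shown earlier at offset 20. For reference, the converter setting in question (standard Kafka Connect property names):

```properties
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
```

In other words, with plain JSON and no schema attached to the records, the JDBC sink has no column information to build its statements from; some schema source, such as the envelope sketched above, is still needed.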