
How can I kill a distributed worker in a Kafka cluster?

I am working with Apache Kafka and using a distributed worker. I can start my worker as shown below:

# Command to start the distributed worker.
bin/connect-distributed.sh config/connect-distributed.properties

This is from the official documentation. After the worker is running, we can create connectors and tasks, and this works fine.
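
For context, here is a minimal sketch of registering a connector through the worker's REST API. The connector name, file path, and topic below are hypothetical, and the worker is assumed to be listening on the default port 8083:

# Sketch: register a FileStreamSource connector via the Connect REST API.
# The name, file, and topic values are placeholder examples.
curl -X POST http://localhost:8083/connectors \
  -H "Content-Type: application/json" \
  -d '{
        "name": "my-file-source",
        "config": {
          "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
          "tasks.max": "1",
          "file": "/tmp/input.txt",
          "topic": "my-topic"
        }
      }'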

But when I change my connector or task logic, I have to add a new JAR to Kafka's classpath, and after that I have to restart the worker.
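
For example, one common approach (the JAR name and path here are assumptions, not from my setup) is to put the new JAR on the worker's classpath before starting it; kafka-run-class.sh picks up the CLASSPATH environment variable:

# Hypothetical: expose a custom connector JAR to the worker, then start it.
export CLASSPATH=/opt/connectors/my-connector-1.0.jar
bin/connect-distributed.sh config/connect-distributed.properties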

I don't know what the right procedure is; I think we should stop the worker and start it again.

But I don't know how to stop the worker correctly. Of course, I can find my process with ps aux | grep worker, kill it, and then kill the REST server, which I would also have to find with ps. But I think that is a strange approach. Killing two processes isn't a good idea, and I can't find any information on how to do it another way.

If you know the right way, please help me :)

Thanks for your time.

asked Oct 26 '16 by aarexer

1 Answer

Killing two processes isn't a good idea

ConnectDistributed is only one process. There is no separate REST server to stop.
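
You can confirm this with jps, which ships with the JDK; the worker's main class is org.apache.kafka.connect.cli.ConnectDistributed:

# List Java processes with their fully qualified main class names.
jps -l | grep ConnectDistributed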

And yes, pausing the connector (PUT /connectors/<name>/pause on the worker's REST API) followed by a kill <pid> is the correct way to stop it.
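
As a sketch, assuming the default REST port 8083 and the hypothetical connector name my-file-source from above:

# Pause the connector via the REST API, then kill the single worker process.
curl -X PUT http://localhost:8083/connectors/my-file-source/pause
kill $(jps -l | grep ConnectDistributed | awk '{print $1}')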

If installed with a recent version of Confluent Platform, you can stop and start it using systemctl.
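
For instance, with the Confluent Platform packages the systemd unit is typically named confluent-kafka-connect (an assumption; verify the unit name on your installation):

# Stop and start the Connect worker as a managed service.
sudo systemctl stop confluent-kafka-connect
sudo systemctl start confluent-kafka-connect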

answered Oct 18 '22 by OneCricketeer