I want to pass input from Django to Kafka. I have fields a, b, c, d and h1, h2, h3. All fields are input by the user. I want to start my Kafka process as soon as the user clicks the submit button on my Django front-end interface. All fields are being saved in MySQL.
I can only find possible solutions in Java using MySQL connectors.
Is there any method or connector with which I can pass the user's input from the Django interface to Kafka using Python?
a, b, c, d - access tokens
h1, h2, h3 - search queries (keywords)
Create a file named consumer1.py with a Python script like the one below. The KafkaConsumer class is imported from the kafka library to read data from Kafka, and the sys module is used to terminate the script. The consumer uses the same hostname and port number as the producer to read data from Kafka.
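A minimal sketch of that consumer script could look like this; the topic name first_topic and the broker address are assumptions and should match whatever your producer uses:

import sys
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    'first_topic',                        # topic to read from (placeholder name)
    bootstrap_servers='localhost:9092',   # same hostname and port as the producer
    auto_offset_reset='earliest',         # read from the beginning of the topic
)

for message in consumer:
    print(message.value.decode('utf-8'))  # assumes the producer sent UTF-8 text
    sys.exit(0)                           # terminate after the first message, via the sys module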
Confluent develops and maintains confluent-kafka-python on GitHub, a Python client for Apache Kafka® that provides a high-level Producer, Consumer and AdminClient compatible with all Kafka brokers >= v0.8, Confluent Cloud and Confluent Platform.
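If you prefer that client, a minimal producer sketch with confluent-kafka-python looks roughly like this; the broker address and the topic name my_topic are placeholders:

from confluent_kafka import Producer

producer = Producer({'bootstrap.servers': '127.0.0.1:9092'})

def delivery_report(err, msg):
    # Called once per message to report delivery success or failure.
    if err is not None:
        print('Delivery failed: {}'.format(err))
    else:
        print('Delivered to {} [{}]'.format(msg.topic(), msg.partition()))

producer.produce('my_topic', value=b'hello from confluent-kafka', callback=delivery_report)
producer.flush()  # block until all queued messages have been delivered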
Accessing Kafka in Python: there are multiple Python libraries available. kafka-python is an open-source, community-maintained library. PyKafka is maintained by Parse.ly and claims to offer a Pythonic API; unlike kafka-python, it can't create dynamic topics.
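For comparison, a minimal PyKafka producer sketch might look like this; the hosts string and topic name are assumptions, and PyKafka looks topics up by a bytes key:

from pykafka import KafkaClient

client = KafkaClient(hosts="127.0.0.1:9092")
topic = client.topics[b"my_topic"]            # topics are addressed by bytes name

with topic.get_sync_producer() as producer:
    producer.produce(b"hello from pykafka")   # messages are raw bytes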
Simply put, Kafka is a distributed publish-subscribe messaging system that maintains feeds of messages in partitioned and replicated topics. At its simplest, there are three players in the Kafka ecosystem: producers, topics (hosted by brokers) and consumers. Producers publish messages to a topic of their choice.
Apache Kafka is a publish-subscribe messaging queue used for real-time streams of data. Apache Kafka lets you send and receive messages between various microservices. Developing a scalable and reliable automation framework for Kafka-based microservices projects can sometimes be challenging.
The easiest way to install Kafka is to download the binaries and run them. Since it is based on JVM languages like Scala and Java, you must make sure that you are using Java 7 or greater. Kafka is available in two different flavours: one from the Apache Software Foundation and one packaged by Confluent.
First install kafka-python (pip install kafka-python), then add the following imports to views.py.
from django.http import HttpResponse
from kafka import KafkaProducer
from kafka import KafkaConsumer
import json    # only needed if you serialize messages as JSON instead of pickle
import pickle  # pickle converts Python objects into a byte string
Then write the producer view as follows. This code converts the data into a byte string and sends it to Kafka. Instead of sending the example message v, you can send your own data, e.g. v = your_data.
def kfk(request):
    producer = KafkaProducer(bootstrap_servers='127.0.0.1:9092')
    v = {
        'msg': {
            'hello': 'world',
        },
    }
    # pickle turns the dict into a byte string that Kafka can transport
    serialized_data = pickle.dumps(v, pickle.HIGHEST_PROTOCOL)
    producer.send('Ptopic', serialized_data)
    producer.flush()  # make sure the message is actually sent before the view returns
    return HttpResponse(status=200)
To consume data:
def cons(request):
    consumer = KafkaConsumer(
        'Ptopic',
        bootstrap_servers=['localhost:9092'],
        api_version=(0, 10),
        # consumer_timeout_ms=1000,  # uncomment to stop iterating when no message arrives for 1 s
    )
    for message in consumer:
        # undo the pickle serialization done by the producer
        deserialized_data = pickle.loads(message.value)
        print(deserialized_data)
    return HttpResponse(status=200)  # only reached if consumer_timeout_ms is set
Note: the Kafka consumer view should already be running before you try to produce; here Ptopic is my topic name.
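Because a blocking loop inside a Django view ties up the request, one option (just a sketch, reusing the topic and broker from the examples above) is to run the consumer as a standalone script in a separate terminal:

import pickle
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    'Ptopic',
    bootstrap_servers=['localhost:9092'],
    api_version=(0, 10),
)

for message in consumer:
    # undo the pickle serialization done by the producer view
    print(pickle.loads(message.value))

Start this script first (e.g. saved as run_consumer.py and launched with python run_consumer.py), then hit the producer view from the browser.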
Yes, just use the KafkaProducer from the kafka-python package and you will be set.
pip install kafka-python
Then in your Django function:
def myfunc(request):
    from django.http import HttpResponse
    from kafka import KafkaProducer
    producer = KafkaProducer(bootstrap_servers='kafkaBroker:9092')
    producer.send('foobar', b'test')   # send() is asynchronous
    producer.flush()                   # block until the message is delivered
    return HttpResponse(status=200)
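To tie this back to the question, a sketch of a view that takes the submitted fields and publishes them to Kafka as JSON could look like the following. The field names a-d and h1-h3 come from the question; the topic name user_input, the broker address and the view name are assumptions, and saving to MySQL is left to your existing model code.

import json
from django.http import HttpResponse
from kafka import KafkaProducer

# Create the producer once at module load rather than per request.
producer = KafkaProducer(
    bootstrap_servers='localhost:9092',
    value_serializer=lambda v: json.dumps(v).encode('utf-8'),  # dict -> JSON bytes
)

def submit(request):
    # Collect the submitted form fields (names follow the question).
    payload = {
        'access_tokens': [request.POST.get(k) for k in ('a', 'b', 'c', 'd')],
        'search_queries': [request.POST.get(k) for k in ('h1', 'h2', 'h3')],
    }
    # Save the fields to MySQL with your Django model here, then publish to Kafka.
    producer.send('user_input', payload)
    producer.flush()
    return HttpResponse(status=200)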