I have a Kafka broker running on my local system. To communicate with the broker from my Django based web application I am using the confluent-kafka wrapper. However, browsing through the admin API, I could not find any API for listing Kafka topics. (The topics are created programmatically and are dynamic.)
Is there any way I could list them within my program?
The requirement is that, if my worker reboots, all consumers listening to those topics have to be re-initialized, so I want to loop over all the topics and assign a consumer to each.
Here is how to do it:
>>> from confluent_kafka.admin import AdminClient
>>> conf = {'bootstrap.servers': 'vps01:9092,vps02:9092,vps03:9092'}
>>> kadmin = AdminClient(conf)
>>> kadmin.list_topics().topics # Returns a dict(). See example below.
{'topic01': TopicMetadata(topic01, 3 partitions),}
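To cover the original requirement (re-creating a consumer per topic after a worker reboot), a rough sketch along those lines could look like the following; the group id and the one-consumer-per-topic layout are assumptions for illustration:
from confluent_kafka import Consumer
from confluent_kafka.admin import AdminClient

conf = {'bootstrap.servers': 'vps01:9092,vps02:9092,vps03:9092'}
kadmin = AdminClient(conf)

# list_topics() returns ClusterMetadata; .topics maps topic name -> TopicMetadata
topic_names = [t for t in kadmin.list_topics().topics if not t.startswith('__')]

consumers = []
for name in topic_names:
    # 'my-group' is a placeholder group id
    c = Consumer({'bootstrap.servers': conf['bootstrap.servers'], 'group.id': 'my-group'})
    c.subscribe([name])
    consumers.append(c)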
I hope that helps.
Two ways you can achieve it:
from confluent_kafka import Consumer
consumer_config = {
    'bootstrap.servers': 'localhost:9092',
    'group.id': 'group1'
}
consumer = Consumer(consumer_config)
consumer.list_topics().topics
This will return a dictionary something like this:
{'raw_request': TopicMetadata(raw_request, 4 partitions),
'raw_requests': TopicMetadata(raw_requests, 1 partitions),
'idw': TopicMetadata(idw, 1 partitions),
'__consumer_offsets': TopicMetadata(__consumer_offsets, 50 partitions)}
The other way has already been posted by @NYCeyes.
Try searching again for list_topics
https://docs.confluent.io/5.2.1/clients/confluent-kafka-python/index.html#confluent_kafka.Consumer.list_topics
https://docs.confluent.io/5.2.1/clients/confluent-kafka-python/index.html#confluent_kafka.Producer.list_topics
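For example, with the Producer (the broker address is a placeholder):
from confluent_kafka import Producer

p = Producer({'bootstrap.servers': 'localhost:9092'})
md = p.list_topics(timeout=10)   # returns ClusterMetadata
print(list(md.topics.keys()))    # names of all topics known to the cluster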
I am trying to write a script to import messages to a Uniform Distributed queue in Weblogic using WLST but I am unable to find a solution that specifically caters to my requirement.
Let me explain the requirement:
I have error queues that store failed messages. I have exported them as an XML file (using WLST) and segregated them, based on the different error codes in the message header, into smaller XML files which need to be imported into the main queue for reprocessing (not using the Admin console).
I am sure there is a way to achieve this, as I am able to import the segregated XML files using the import option in the Admin console, which works like a charm, but I have no idea how that is actually done, so I cannot implement it as a script.
I have explored a few options, like exporting the messages as a binary SER file, which works but cannot be used to filter out only the retryable messages.
The WLST method importMessages() only accepts a CompositeData array. Any method to convert/create the required CompositeData array from the XML files would also be a great solution to the issue.
I agree it is not very simple and intuitive.
You have two solutions:
pure WLST code
Java code using the JMS API
If you want to write pure WLST code, here is a code sample that will help you. The code creates and publishes n messages into a queue.
The buildJMSMessage() function is responsible for creating a text message.
from javax.management.openmbean import CompositeData
from weblogic.jms.extensions import JMSMessageInfo, JMSMessageFactoryImpl
import jarray
...
def buildJMSMessage(text):
    handle = 1
    state = 1
    XidString = None
    sequenceNumber = 1
    consumerID = None
    # Build a JMS text message carrying the given payload
    wlmessage = JMSMessageFactoryImpl.getFactory().createTextMessage(text)
    destinationName = ""
    bodyIncluded = True
    msg = JMSMessageInfo(handle, state, XidString, sequenceNumber, consumerID, wlmessage, destinationName, bodyIncluded)
    return msg
....
quantity = 10
# importMessages() expects a CompositeData array
messages = jarray.zeros(quantity, CompositeData)
for i in range(0, quantity):
    messages[i] = buildJMSMessage('Test message #' + str(i)).toCompositeData()
queue.importMessages(messages, False)
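To feed importMessages() from your segregated XML files instead of literal strings, something along these lines might work; note that the file path and the mes:Text element name are assumptions, so adapt them to the structure your WLST export actually produces:
from xml.dom import minidom
import jarray
from javax.management.openmbean import CompositeData

# Hypothetical layout: each exported message carries its payload in a mes:Text element
doc = minidom.parse('/tmp/retryable_messages.xml')   # placeholder path
texts = [node.firstChild.data for node in doc.getElementsByTagName('mes:Text')]

messages = jarray.zeros(len(texts), CompositeData)
for i in range(len(texts)):
    messages[i] = buildJMSMessage(texts[i]).toCompositeData()
queue.importMessages(messages, False)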
kafka-python contains multiple modules to create/delete topics and also to pass multiple configurations while doing so.
Is there a way to add additional configuration to the following method?
NewTopic(name=topicname, num_partitions=1, replication_factor=1)
Yes it's possible to create a compacted topic with kafka-python.
The NewTopic constructor accepts a topic_configs argument to specify the topic configurations.
For example:
from kafka import KafkaAdminClient
from kafka.admin import NewTopic
admin = KafkaAdminClient(bootstrap_servers=['localhost:9092'])
topic = NewTopic('bar', 1, 1, topic_configs={'cleanup.policy': 'compact'})
response = admin.create_topics([topic])
print(response)
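To check that the setting took effect, the same admin client can describe the topic configuration; a small sketch, assuming the 'bar' topic created above:
from kafka import KafkaAdminClient
from kafka.admin import ConfigResource, ConfigResourceType

admin = KafkaAdminClient(bootstrap_servers=['localhost:9092'])
# ask the broker for the effective configuration of topic 'bar'
resource = ConfigResource(ConfigResourceType.TOPIC, 'bar')
print(admin.describe_configs(config_resources=[resource]))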
I am getting started with Kafka and am fairly new to Python. I am using the kafka-python library to communicate with my Kafka broker. Now I need to dynamically create a topic from my code. From the docs I see I can call the create_topics() method to do so; however, I am not sure how to get an instance of the class it belongs to. I am not able to understand this from the docs.
Can some one help me with this?
You first need to create an instance of KafkaAdminClient. The following should do the trick for you:
from kafka.admin import KafkaAdminClient, NewTopic
admin_client = KafkaAdminClient(
    bootstrap_servers="localhost:9092",
    client_id='test'
)
topic_list = [NewTopic(name="example_topic", num_partitions=1, replication_factor=1)]
admin_client.create_topics(new_topics=topic_list, validate_only=False)
Alternatively, you can use the confluent_kafka client, which is a lightweight wrapper around librdkafka:
from confluent_kafka.admin import AdminClient, NewTopic
admin_client = AdminClient({"bootstrap.servers": "localhost:9092"})
topic_list = [NewTopic("example_topic", 1, 1)]
admin_client.create_topics(topic_list)
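Note that confluent_kafka's create_topics() works asynchronously and returns a dict of futures keyed by topic name, so it is worth waiting on them to surface any errors:
# create_topics() returns {topic_name: future}; result() raises on failure
futures = admin_client.create_topics(topic_list)
for topic, future in futures.items():
    try:
        future.result()
        print("Created topic {}".format(topic))
    except Exception as e:
        print("Failed to create topic {}: {}".format(topic, e))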
I'm trying to use the curator Python API to periodically delete old logs in Elasticsearch.
In my code output, it seems I can't filter the indices I want, which has had me stuck for several days.
Can anyone help take a look at whether I have done anything wrong?
I used elasticsearch-curator version 5.4.1 and tested against Elasticsearch 5.5 on EC2, AWS Elasticsearch 5.5 and AWS Elasticsearch 6.0; the results are the same.
Following is my code:
from elasticsearch import Elasticsearch
import elasticsearch
import curator
def handler():
    client = elasticsearch.Elasticsearch(['http://XX.153.17.133:9200'])
    ilo = curator.IndexList(client)
    print ilo.all_indices
    print ilo.filter_by_regex(kind='prefix', value='mov')
    print ilo.filter_by_age(source='creation_date', direction='older', unit='seconds', unit_count=2)
    #delete_indices = curator.DeleteIndices(ilo)
    #delete_indices.do_action()
    return
handler()
And following is the output:
/Users/junyu/PycharmProjects/es-curator/bin/python
/Users/junyu/PycharmProjects/es-curator/es-curator.py
[u'movie']
None
None
Process finished with exit code 0
Thank you in advance!
You will only see output from your print ilo.filter* lines if you have logging set up, because the filter methods act on the index list in place and do not return anything.
If you want to see what indices remain after the filters, then try print ilo.indices, as that is the working list.
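A minimal sketch of that approach (the logging level and the filter values simply mirror the question):
import logging
import elasticsearch
import curator

logging.basicConfig(level=logging.INFO)  # so the filter steps log what they remove

client = elasticsearch.Elasticsearch(['http://XX.153.17.133:9200'])
ilo = curator.IndexList(client)
ilo.filter_by_regex(kind='prefix', value='mov')
ilo.filter_by_age(source='creation_date', direction='older', unit='seconds', unit_count=2)
print ilo.indices  # the working list after the filters
# curator.DeleteIndices(ilo).do_action()  # uncomment to actually delete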
I created an alarm and want to delete it afterward...
The documentation for boto 2 doesn't show how to do that.
Any help?
Thanks
If you want to delete alarms, the API you need is DeleteAlarms. The link in your question mentions it (search for delete_alarms).
Also, boto 3 is the recommended version to use and here is the API you need: https://boto3.readthedocs.io/en/latest/reference/services/cloudwatch.html#CloudWatch.Client.delete_alarms
Example of how to do it with Boto 3:
import boto3
client = boto3.client('cloudwatch')
client.delete_alarms(AlarmNames=['SomeAlarmName'])
Boto 2 example:
import boto
client = boto.connect_cloudwatch()
client.delete_alarms(['SomeAlarmName'])
If you don't know the name, you can get a list of alarms with (the same for boto 2 and 3):
client.describe_alarms()
You should use Boto3. But if you are tied to Boto2, then:
import boto
cw = boto.connect_cloudwatch()
alarms = cw.describe_alarms()
for alarm in alarms:
    print alarm.name
Check if the alarm you want to delete is listed. Then use that name:
cw.delete_alarms([<alarm_to_be_deleted>])
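For example, to delete every alarm whose name starts with a given prefix (the prefix is just a placeholder):
import boto

cw = boto.connect_cloudwatch()
# collect the names of the alarms to remove
to_delete = [a.name for a in cw.describe_alarms() if a.name.startswith('SomePrefix-')]
if to_delete:
    cw.delete_alarms(to_delete)  # expects a list of alarm names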