I want to print an attribute value based on its name, take for example
<META NAME="City" content="Austin">
I want to do something like this:

    soup = BeautifulSoup(f)  # f is some HTML containing the above meta tag
    for meta_tag in soup("meta"):
        if meta_tag["name"] == "City":
            print(meta_tag["content"])
The above code gives a KeyError: 'name'. I believe this is because name is used by BeautifulSoup, so it can't be used as a keyword argument.
Beautifulsoup: Get the attribute value of an element. Find all elements by the ul tag, iterate over the result, and get the class value of each element.
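A minimal sketch of that pattern (the ul markup below is made up purely for illustration):

    from bs4 import BeautifulSoup

    # hypothetical markup, just to show the find_all / class lookup pattern
    html = '<ul class="menu"><li>Home</li></ul><ul class="footer"><li>About</li></ul>'
    soup = BeautifulSoup(html, "html.parser")
    for ul in soup.find_all("ul"):
        # .get() returns None instead of raising KeyError when the attribute is missing
        print(ul.get("class"))  # class values come back as a list, e.g. ['menu']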
BeautifulSoup has limited support for CSS selectors, but it covers the most commonly used ones. Use the select() method to find multiple elements and select_one() to find a single element.
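Applied to the meta tag from the question, a sketch using a CSS attribute selector could look like this (select_one() needs a reasonably recent bs4, and the selector string is just one way you might target the tag):

    from bs4 import BeautifulSoup

    soup = BeautifulSoup('<META NAME="City" content="Austin">', "html.parser")
    tag = soup.select_one('meta[name="City"]')    # first match, or None
    if tag is not None:
        print(tag["content"])                     # Austin
    for tag in soup.select('meta[name="City"]'):  # every match
        print(tag["content"])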
find() method: the find() method finds the first tag with the specified name or id and returns an object of type bs4.element.Tag. For instance, consider a simple HTML webpage containing several paragraph tags.
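A small sketch of that, with made-up paragraph markup:

    from bs4 import BeautifulSoup

    # hypothetical page with two paragraph tags
    html = "<p id='intro'>First paragraph</p><p>Second paragraph</p>"
    soup = BeautifulSoup(html, "html.parser")
    first_p = soup.find("p")           # first <p>, a bs4.element.Tag
    print(first_p.get_text())          # First paragraph
    print(soup.find("p", id="intro"))  # you can also match on id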
If you want to retrieve multiple attribute values from the source above, you can use findAll and a list comprehension to get everything you need:

    import urllib
    f = urllib.urlopen("http://58.68.130.147")
    s = f.read()
    f.close()
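The snippet stops before the findAll step; a sketch of what that list comprehension could look like (Python 2, matching the urllib.urlopen call above; pulling the content attribute of meta tags is an assumption for illustration):

    import urllib
    from bs4 import BeautifulSoup

    s = urllib.urlopen("http://58.68.130.147").read()
    soup = BeautifulSoup(s)
    # content=True keeps only the tags that actually have a content attribute
    contents = [tag["content"] for tag in soup.findAll("meta", content=True)]
    print(contents)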
It's pretty simple; use the following:
    >>> from bs4 import BeautifulSoup
    >>> soup = BeautifulSoup('<META NAME="City" content="Austin">')
    >>> soup.find("meta", {"name": "City"})
    <meta name="City" content="Austin" />
    >>> soup.find("meta", {"name": "City"})['content']
    u'Austin'
Leave a comment if anything is not clear.
theharshest answered the question, but here is another way to do the same thing. Also, in your example you have NAME in caps, while in your code you have name in lowercase.
    from bs4 import BeautifulSoup

    s = '<div class="question" id="get attrs" name="python" x="something">Hello World</div>'
    soup = BeautifulSoup(s)
    attributes_dictionary = soup.find('div').attrs
    print(attributes_dictionary)
    # prints: {'id': 'get attrs', 'x': 'something', 'class': ['question'], 'name': 'python'}
    print(attributes_dictionary['class'][0])
    # prints: question
    print(soup.find('div').get_text())
    # prints: Hello World