I am attempting to parse the XML file below but am having difficulty getting a specific element's value. I want to look up the element 'Item_No_2' and retrieve the related value <v>2222222222</v>, but I am unable to do it using element.get('Item_No_2'). Am I using get() incorrectly?
XML File:
<?xml version="1.0" encoding="utf-8"?>
<?xml-stylesheet type="text/xsl" href="Data.xsl"?>
<abc>
  <md>
    <mi>
      <datetime>20160822020003</datetime>
      <period>3600</period>
      <it>Item_No_1</it>
      <it>Item_No_2</it>
      <it>Item_No_3</it>
      <it>Item_No_4</it>
      <it>Item_No_5</it>
      <it>Item_No_6</it>
      <it>Item_No_7</it>
      <ovalue>
        <v>1111111111</v>
        <v>2222222222</v>
        <v>3333333333</v>
        <v>4444444444</v>
        <v>5555555555</v>
        <v>6666666666</v>
        <v>7777777777</v>
      </ovalue>
    </mi>
  </md>
</abc>
My Code:
from xml.etree.ElementTree import parse

doc = parse('test.xml').getroot()

# Print the text of every direct child of <mi>
for element in doc.findall('md/mi/'):
    print(element.text)

# Print the text of every <v> under <ovalue>
for element in doc.findall('md/mi/ovalue/'):
    print(element.text)
The current output prints the two lists separately, but I can't work out how to look up the value for one specific item.
Output:
20160822020003
3600
Item_No_1
Item_No_2
Item_No_3
Item_No_4
Item_No_5
Item_No_6
Item_No_7
1111111111
2222222222
3333333333
4444444444
5555555555
6666666666
7777777777
I tried this, but it did not work:
for element in doc.findall('md/mi/ovalue/'):
    print(element.get('Item_No_1'))
There is no attribute named 'Item_No_1' on the elements found by doc.findall('md/mi/ovalue/'), which is why element.get('Item_No_1') returns None: Element.get() looks up XML attributes, and your <v> elements have no attributes at all.
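For contrast, here is a minimal sketch with a hypothetical id attribute (your <v> elements have none) showing the difference between what get() reads and what .text reads:

from xml.etree.ElementTree import fromstring

# Hypothetical <v> carrying an id attribute, for illustration only;
# the <v> elements in the file above have no attributes.
el = fromstring('<v id="Item_No_2">2222222222</v>')

print(el.get('id'))  # Item_No_2 -- get() reads XML attributes
print(el.text)       # 2222222222 -- .text reads the element's text content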
I think what you want to do is build both lists:
items = [e.text for e in doc.findall('md/mi/it')]
values = [e.text for e in doc.findall('md/mi/ovalue/v')]
Then find the index of the string 'Item_No_1' in items, and index into values with that number, as in the sketch below.
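A minimal sketch of that lookup, assuming the <it> names and the <v> values appear in matching order in the file:

items = [e.text for e in doc.findall('md/mi/it')]
values = [e.text for e in doc.findall('md/mi/ovalue/v')]

# The name's position in items gives the value's position in values.
idx = items.index('Item_No_1')  # raises ValueError if the name is absent
print(values[idx])              # 1111111111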
Alternatively, zip the two lists together and stop when you find the element you want:
for item, value in zip(doc.findall('md/mi/it'), doc.findall('md/mi/ovalue/v')):
    if item.text == 'Item_No_1':
        print(value.text)
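If you need to look up several items, you could also fold the pairs into a dict once and index it directly (the same zip idea, just stored for reuse):

# Build a name -> value mapping from the two parallel lists.
lookup = {it.text: v.text
          for it, v in zip(doc.findall('md/mi/it'),
                           doc.findall('md/mi/ovalue/v'))}
print(lookup['Item_No_2'])  # 2222222222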
There might be a better way, but those are the first approaches that come to mind.