How do I check for the presence of a particular layer in a Scapy packet? For example, I need to check the src/dst fields of an IP header, but how do I know that a particular packet actually has an IP header (as opposed to, for instance, an IPv6 header)?
My problem is that when I go to check for an IP header field, I get an error saying that the IP layer doesn't exist. Instead of an IP header, this particular packet had IPv6.
pkt = Ether(packet_string)
if pkt[IP].dst == something:
    # do this
My error occurs when I try to reference the IP layer. How do I check for that layer's existence before attempting to manipulate it?
Thanks!
Sniffing packets using Scapy: to sniff packets, use the sniff() function. It returns information about all the packets that have been sniffed; to see a one-line summary of each, call summary() on the result. By default, sniff() listens indefinitely until the user interrupts it.
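A minimal sketch of that (the count of 10 is arbitrary, and sniffing usually requires root privileges):

from scapy.all import sniff
# capture 10 packets, then stop (omit count to sniff until interrupted)
pkts = sniff(count=10)
# one-line summary of each captured packet
pkts.summary()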
sprintf() is a method that formats a packet's data in a human-readable form.
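For example (the address and port below are just placeholders):

from scapy.all import IP, TCP
pkt = IP(dst="192.0.2.1") / TCP(dport=80)
# %IP.src%, %IP.dst% and %TCP.dport% are field references expanded by sprintf()
print(pkt.sprintf("%IP.src% -> %IP.dst% : %TCP.dport%"))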
Send and receive packets (sr): the function sr1() is a variant that returns only one packet, the one that answered the packet (or packet set) sent. The packets must be layer 3 packets (IP, ARP, etc.). The function srp() does the same for layer 2 packets (Ethernet, 802.3, etc.).
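A rough sketch of both (the destination address and network below are placeholders):

from scapy.all import IP, ICMP, Ether, ARP, sr1, srp
# layer 3: send one ICMP echo request and keep only the first answer
reply = sr1(IP(dst="192.0.2.1") / ICMP(), timeout=2, verbose=False)
if reply is not None:
    reply.show()
# layer 2: ARP who-has broadcast on the local network
answered, unanswered = srp(Ether(dst="ff:ff:ff:ff:ff:ff") / ARP(pdst="192.0.2.0/24"), timeout=2, verbose=False)
answered.summary()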
Reading a pcap file with Scapy is commonly done with rdpcap(). This function reads the whole file and loads it into memory, which, depending on the size of the file, can take quite a lot of memory.
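For instance (the filename is a placeholder; for very large captures, PcapReader lets you iterate without loading everything at once):

from scapy.all import rdpcap, PcapReader
# load the whole capture into memory
pkts = rdpcap("capture.pcap")
print(len(pkts))
# or iterate packet by packet to keep memory usage low
for pkt in PcapReader("capture.pcap"):
    pass  # process each packet here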
For completeness I thought I would also mention the haslayer() method.
>>> pkts=rdpcap("rogue_ospf_hello.pcap")
>>> p=pkts[0]
>>> p.haslayer(UDP)
0
>>> p.haslayer(IP)
1
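Applied to the original question, it might look something like this (the pcap filename and destination address are placeholders):

from scapy.all import rdpcap, IP
pkts = rdpcap("capture.pcap")
for p in pkts:
    # skip packets that have no IPv4 header (e.g. IPv6 traffic)
    if p.haslayer(IP) and p[IP].dst == "192.0.2.1":
        pass  # do this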
Hope that helps as well.
You should try the in operator. It returns True or False depending on whether the layer is present in the Packet.
root@u1010:~/scapy# scapy
Welcome to Scapy (2.2.0-dev)
>>> load_contrib("ospf")
>>> pkts=rdpcap("rogue_ospf_hello.pcap")
>>> p=pkts[0]
>>> IP in p
True
>>> UDP in p
False
>>>
root@u1010:~/scapy#
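Applied to the code in the question, that would look roughly like this (the address is a placeholder for your own check):

from scapy.all import Ether, IP
pkt = Ether(packet_string)
if IP in pkt and pkt[IP].dst == "192.0.2.1":
    pass  # do this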