Installing Scrapy on Windows with Python 2.7
1. Install Python 2.7
Official homepage: http://www.python.org/
Download: http://www.python.org/ftp/python/2.7.3/python-2.7.3.msi
1) Run the installer
Install directory: C:\Python27
2) Add the environment variables
C:\Python27;C:\Python27\Scripts
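The PATH change above can be sanity-checked from Python. Below is a minimal sketch of such a check; the directory names assume the default C:\Python27 install location, and the sample PATH string is only there to keep the example self-contained (on the machine being set up you would pass os.environ["PATH"] instead):

```python
# Directories the step above adds to PATH (assumed default install location).
REQUIRED = [r"C:\Python27", r"C:\Python27\Scripts"]

def missing_from_path(dirs, path_value):
    """Return the entries of dirs absent from a semicolon-separated PATH string."""
    entries = set(p.strip().lower() for p in path_value.split(";") if p.strip())
    return [d for d in dirs if d.lower() not in entries]

# Sample PATH value with both entries present:
demo = r"C:\Windows\system32;C:\Python27;C:\Python27\Scripts"
print(missing_from_path(REQUIRED, demo))                    # -> []
print(missing_from_path(REQUIRED, r"C:\Windows\system32"))  # both reported missing
```

The comparison is case-insensitive because Windows paths are.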
2. Install setuptools
Official homepage: http://pypi.python.org/pypi/setuptools
Download: http://pypi.python.org/packages/2.7/s/setuptools/setuptools-0.6c11.win32-py2.7.exe
3. Install Zope.Interface
Official homepage: http://pypi.python.org/pypi/zope.interface/
Download: http://pypi.python.org/packages/2.7/z/zope.interface/zope.interface-4.0.1-py2.7-win32.egg
Installation:
C:\>cd C:\Python27\Scripts
C:\Python27\Scripts>easy_install.exe zope.interface-4.0.1-py2.7-win32.egg
Warnings printed during installation can usually be ignored: the install succeeded if entering "import zope.interface" in the Python interpreter raises no error.
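That import check works for any of the packages in this guide: importing succeeds exactly when the package is installed. A small sketch of the pattern:

```python
def is_installed(module_name):
    """Return True if module_name can be imported, False otherwise."""
    try:
        __import__(module_name)
        return True
    except ImportError:
        return False

print(is_installed("os"))              # stdlib module, always True
print(is_installed("zope.interface"))  # should be True once this step succeeded
```

The same call can later verify w3lib ("w3lib"), Twisted ("twisted"), and Scrapy ("scrapy").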
4. Install Twisted
Official homepage: http://twistedmatrix.com/trac/wiki/TwistedProject
Download: http://pypi.python.org/packages/2.7/T/Twisted/Twisted-12.1.0.win32-py2.7.msi
5. Install w3lib
Official homepage: http://pypi.python.org/pypi/w3lib
Download: http://pypi.python.org/packages/source/w/w3lib/w3lib-1.2.tar.gz
Extract to: C:\w3lib-1.2
Installation: C:\w3lib-1.2>python setup.py install
6. Install libxml2
Official homepage: http://users.skynet.be/sbi/libxml-python/
Download: http://users.skynet.be/sbi/libxml-python/binaries/libxml2-python-2.7.7.win32-py2.7.exe
7. Install pyOpenSSL
Official homepage: http://pypi.python.org/pypi/pyOpenSSL
Download: http://pypi.python.org/packages/2.7/p/pyOpenSSL/pyOpenSSL-0.13.winxp32-py2.7.msi
8. Install Scrapy
Official homepage: http://scrapy.org/
Download: http://pypi.python.org/packages/source/S/Scrapy/Scrapy-0.14.4.tar.gz
Extract to: C:\Scrapy-0.14.4
Installation: C:\Scrapy-0.14.4>python setup.py install
9. Verify the installation
C:\Python27>scrapy
Scrapy 0.14.4 - no active project
Usage:
scrapy <command> [options] [args]
Available commands:
fetch Fetch a URL using the Scrapy downloader
runspider Run a self-contained spider (without creating a project)
settings Get settings values
shell Interactive scraping console
startproject Create new project
version Print Scrapy version
view Open URL in browser, as seen by Scrapy
Use "scrapy <command> -h" to see more info about a command
Success!
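The manual smoke test above can also be scripted. This sketch checks that an executable launches and exits cleanly; the Python interpreter itself stands in for the scrapy command so the example runs anywhere, and on the freshly set-up machine you would pass ["scrapy", "version"] instead:

```python
import subprocess
import sys

def cli_available(argv):
    """Return True if the command runs and exits with status 0."""
    try:
        return subprocess.call(argv, stdout=subprocess.PIPE,
                               stderr=subprocess.PIPE) == 0
    except OSError:  # executable not found on PATH
        return False

# sys.executable stands in for "scrapy" so the example is self-contained.
print(cli_available([sys.executable, "--version"]))  # -> True
print(cli_available(["no-such-command-xyz"]))        # -> False
```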