Summary: Project requirement: crawl 100 Baidu Baike entries on "web crawler" and their related entries, collecting each entry's title, summary, and link. Architecture: a crawler scheduler, a URL manager, an HTML downloader, an HTML parser, and a data store (a minimal sketch of this layout follows the entry below). Read more
posted @ 2017-11-23 09:02 pop_PY Views(145) Comments(0) Recommended(0)
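The sketch below is a minimal, illustrative take on the five-component layout named above, assuming the requests and BeautifulSoup4 libraries are available; the class names, the lemma-summary selector, the root URL, and the in-memory data store are assumptions for illustration, not the post's actual code.

import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

class UrlManager:
    # Keeps the frontier of unvisited URLs and the set of visited ones.
    def __init__(self):
        self.new_urls, self.old_urls = set(), set()
    def add(self, url):
        if url and url not in self.new_urls and url not in self.old_urls:
            self.new_urls.add(url)
    def has_new(self):
        return bool(self.new_urls)
    def get(self):
        url = self.new_urls.pop()
        self.old_urls.add(url)
        return url

class HtmlDownloader:
    # Fetches one page and returns its HTML text, or None on a non-200 response.
    def download(self, url):
        resp = requests.get(url, timeout=10)
        resp.encoding = 'utf-8'
        return resp.text if resp.status_code == 200 else None

class HtmlParser:
    # Extracts the title, summary and outgoing entry links from one page.
    # The 'lemma-summary' class and the '/item/' link filter are assumed Baidu Baike selectors.
    def parse(self, page_url, html):
        soup = BeautifulSoup(html, 'html.parser')
        title = soup.title.string if soup.title else ''
        tag = soup.find('div', class_='lemma-summary')
        summary = tag.get_text(strip=True) if tag else ''
        links = {urljoin(page_url, a['href'])
                 for a in soup.find_all('a', href=True) if '/item/' in a['href']}
        return {'url': page_url, 'title': title, 'summary': summary}, links

class DataStore:
    # Collects parsed records in memory; the actual post may write them to HTML or a database.
    def __init__(self):
        self.records = []
    def store(self, record):
        self.records.append(record)

class Scheduler:
    # Drives the other four components until `limit` entries have been collected.
    def __init__(self):
        self.urls, self.downloader = UrlManager(), HtmlDownloader()
        self.parser, self.data = HtmlParser(), DataStore()
    def crawl(self, root_url, limit=100):
        self.urls.add(root_url)
        while self.urls.has_new() and len(self.data.records) < limit:
            url = self.urls.get()
            html = self.downloader.download(url)
            if html is None:
                continue
            record, new_links = self.parser.parse(url, html)
            self.data.store(record)
            for link in new_links:
                self.urls.add(link)
        return self.data.records

# Example usage (the root URL is an assumption about the post's setup):
# records = Scheduler().crawl('https://baike.baidu.com/item/网络爬虫', limit=100)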
Summary:
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# __author__ = 王小刀
data = [100, 1, 55, 23, 5, 15, 7, 14, 8, 29, 18, 9, 2]
count = 0
for j in range(1, len(data)):
    for i in range(
Read more
posted @ 2017-09-26 10:09 pop_PY Views(137) Comments(0) Recommended(0)
Summary:
# -*- coding: utf-8 -*-
# __author__ = 王小刀
n = input("input the numbers")   # Python 2 input() evaluates the typed expression, so n becomes an int
a = [[col for col in range(n)] for row in range(n)]
k = 0
print("原来的矩阵型状")   # "the original shape of the matrix"
for h in range(n):
Read more
posted @ 2017-09-13 11:47 pop_PY Views(211) Comments(0) Recommended(0)
Summary:
# -*- coding: utf-8 -*-
# __author__ = 王小刀
def w1(main_func):
    def outer(request, kargs):
        print('before')
        main_func(request, kargs)
        print('after')
    return outer
@
Read more
posted @ 2017-09-08 16:09 pop_PY Views(86) Comments(0) Recommended(0)
Summary:
def cash_out(amount):
    while amount > 0:
        amount -= 1
        yield 1
        print('get again %s' % amount)

atm = cash_out(5)
print(type(atm))
print(atm.next())   # Python 2 generator method; in Python 3 this would be next(atm)
print(234234)
pri
Read more
posted @ 2017-09-07 15:29 pop_PY Views(219) Comments(0) Recommended(0)
Summary:
f = open("1.log", 'r')
f1 = open("2.txt", 'w')
for i in f:
    if "qSrcAddr=180.163.194.15" in i:
        f1.write(i)
        print(i)
f.close()
f1.close()
Read more
posted @ 2017-09-07 14:03 pop_PY Views(181) Comments(0) Recommended(0)
Summary:
f = open("1.log", 'r')
s = f.readlines()
for i in range(911):   # 911 is presumably the number of lines in 1.log
    if "qSrcAddr=180.163.194.15" in s[i]:
        print(s[i])
f.close()
Read more
posted @ 2017-09-04 20:04 pop_PY Views(140) Comments(0) Recommended(0)
Summary:
# f = open('test.txt', 'w')
# f.write('王why1234')
# f.close()
f = open('test.txt', 'r+')
f.seek(5)
f.truncate()
f.close()
Read more
posted @ 2017-09-04 16:08 pop_PY Views(115) Comments(0) Recommended(0)
Summary:
import copy
dic = {
    "cpu": [80, ],
    "mem": [80, ],
    "disk": [80, ]
}
print('before', dic)
new_dic = copy.deepcopy(dic)
dic["cpu"][0] = 50
print(new_dic)
print(dic)

import c
Read more
posted @ 2017-08-23 09:56 pop_PY Views(87) Comments(0) Recommended(0)
Summary:
import collections
MytupleClass = collections.namedtuple('MytupleClass', ['x', 'y', 'z'])
obj = MytupleClass(11, 22, 33)
print(obj.x)
print(obj.y)
print(obj.z)
pri
Read more
posted @ 2017-08-22 16:33 pop_PY Views(214) Comments(0) Recommended(0)