Python Multiprocessing For Function That Adds Headers To URLs From List
In Python, I have a function that takes a list of URLs (url_list) and adds
headers to each request. The list can have up to 25,000 URLs, so I wanted
to make use of multiprocessing. I tried the following code, but I don't
think it's truly doing multiprocessing because of the join. How can I make
this truly multiprocess?
import requests
from multiprocessing import Process

def do_it(url_list, headers):
    for i in url_list:
        print "adding header to: \n" + i
        requests.post(i, headers=headers)
    print "done!"

value = raw_input("Proceed? Enter [Y] for yes: ")
if value == "Y":
    p = Process(target=do_it, args=(url_list, headers))
    p.start()
    p.join()
else:
    print "Operation Failed!"
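For context: a single Process followed by join() still sends all 25,000 requests sequentially inside that one child, so the join isn't the real bottleneck. To spread the work out, the list needs to be split across several workers, e.g. with multiprocessing.Pool. A minimal sketch (Python 3 syntax; post_one is a hypothetical stand-in that would call requests.post in the real script, stubbed out here so the example runs without network access):

```python
from functools import partial
from multiprocessing import Pool

def post_one(headers, url):
    # The real script would do: requests.post(url, headers=headers)
    # (stubbed to a return value so this sketch needs no network)
    return url, headers

def do_it(url_list, headers, workers=8):
    # Pool.map splits url_list across `workers` processes and blocks
    # until every call has returned, preserving input order.
    pool = Pool(processes=workers)
    try:
        return pool.map(partial(post_one, headers), url_list)
    finally:
        pool.close()
        pool.join()

if __name__ == "__main__":
    urls = ["http://example.com/a", "http://example.com/b"]
    for url, hdrs in do_it(urls, {"X-Token": "abc"}, workers=2):
        print("posted to " + url)
```

Here join() on the pool is still used, but only after all workers have been handed their share of the URLs, so the posts genuinely run in parallel.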