Other methods of the Process object
p = Process(target=f)
p.pid: view the process ID; p.name: view the process name
p.is_alive(): returns True or False, i.e. whether the process is still running
p.terminate(): sends a signal to the operating system to end the process
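A minimal sketch putting these methods together (the worker f1 and its sleep time are placeholders, not part of the original example):

import time
from multiprocessing import Process

def f1():
    time.sleep(1)  # placeholder work so the child stays alive for a moment

if __name__ == '__main__':
    p = Process(target=f1)
    p.start()
    print(p.pid)          # the child's process ID
    print(p.name)         # e.g. 'Process-1'
    print(p.is_alive())   # True: f1 is still sleeping
    p.terminate()         # ask the OS to end the child
    time.sleep(0.1)       # terminate() is asynchronous, so give it a moment
    print(p.is_alive())   # False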
Verify that processes are isolated in memory (each process has its own address space)
from multiprocessing import Process

num = 100

def f1():
    global num
    num = 3
    print(num)  # Result: 3

if __name__ == '__main__':
    p = Process(target=f1)
    p.start()
    p.join()
    print(num)  # Result: 100, the child's change did not affect the parent
Daemon
When the main process's code finishes running, any child processes set as daemons are terminated with it.
p.daemon = True (must be set before p.start())
import time
from multiprocessing import Process

def f1():
    time.sleep(3)
    print('xxxx')

def f2():
    time.sleep(5)
    print('Code of a normal subprocess')

if __name__ == '__main__':
    p = Process(target=f1)
    p.daemon = True  # must come before start()
    p.start()
    p2 = Process(target=f2)
    p2.start()
    print('End of main process')
    # The daemon ends together with the main process's code, so 'xxxx' never
    # prints; the normal subprocess f2 still runs to completion
Process lock
Data sharing: Manager
When data (such as file contents or a shared dict) is modified by multiple processes at the same time, the data becomes unsafe.
A lock guarantees data safety at the cost of efficiency: the code inside the lock executes serially (synchronously). Such a lock is also called a synchronization lock or mutex (mutual-exclusion lock).
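For illustration, here is a minimal sketch of the race itself, using the same Manager dict counter as the locked example below but with the lock removed; each process can read the old value before any other process writes its result back, so decrements get lost:

import time
from multiprocessing import Process, Manager

def f1(m_d):
    tmp = m_d['num']   # several processes may read the same old value...
    tmp -= 1
    time.sleep(0.1)
    m_d['num'] = tmp   # ...and write back the same result, losing updates

if __name__ == '__main__':
    m = Manager()
    m_d = m.dict({'num': 100})
    p_list = [Process(target=f1, args=(m_d,)) for i in range(10)]
    [p.start() for p in p_list]
    [p.join() for p in p_list]
    print(m_d['num'])  # typically greater than 90 (often 99): updates were lost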
Two ways to use a lock
l = Lock()
1). with l:
        # the code protected by the lock
2). l.acquire()
    # the code protected by the lock
    l.release()
import time
from multiprocessing import Process, Manager, Lock

def f1(m_d, l2):
    with l2:  # equivalent to l2.acquire() ... l2.release()
        tmp = m_d['num']
        tmp -= 1
        time.sleep(0.1)
        m_d['num'] = tmp

if __name__ == '__main__':
    m = Manager()
    l2 = Lock()
    m_d = m.dict({'num': 100})
    p_list = []
    for i in range(10):
        p = Process(target=f1, args=(m_d, l2))
        p.start()
        p_list.append(p)
    [pp.join() for pp in p_list]
    print(m_d['num'])  # 90: with the lock, no decrement is lost
Queue
Queue()
q = Queue(10)  # a queue with a maximum length of 10
q.put(): put data into the queue
q.get(): fetch data from the queue
q.qsize(): returns the number of items currently in the queue
q.put_nowait(): does not wait (non-blocking); raises an error if the queue is full
q.get_nowait(): does not wait (non-blocking); raises an error if the queue is empty
q.full() / q.empty(): report whether the queue is full / empty
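A quick sketch exercising these methods on a tiny queue (the values are arbitrary):

from multiprocessing import Queue
import queue  # the Full and Empty exceptions live in the queue module

if __name__ == '__main__':
    q = Queue(2)           # maximum length 2
    q.put('a')
    q.put('b')
    print(q.qsize())       # 2 (note: qsize() raises NotImplementedError on macOS)
    print(q.full())        # True
    try:
        q.put_nowait('c')  # full queue: raises instead of blocking
    except queue.Full:
        print('the queue is full')
    print(q.get())         # 'a'
    print(q.get())         # 'b'
    print(q.empty())       # True
    try:
        q.get_nowait()     # empty queue: raises instead of blocking
    except queue.Empty:
        print('the queue is empty')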
Using a queue to implement the producer-consumer model provides buffering and decoupling between the two sides.
import time
from multiprocessing import Process, Queue, JoinableQueue

# Producer
def producer(q):
    for i in range(10):
        time.sleep(0.2)
        s = 'Big stuffed bun No.%s' % i
        print(s + ' is freshly baked')
        q.put(s)
    q.join()  # wait until the number of task_done() signals equals the number of items put, then continue
    print('All the tasks have been dealt with')

# Consumer
def consumer(q):
    while 1:
        time.sleep(0.5)
        baozi = q.get()
        print(baozi + ' has been eaten')
        q.task_done()  # send the queue a signal that one task has been processed

if __name__ == '__main__':
    # q = Queue(30)
    q = JoinableQueue(30)  # also a queue, with a maximum length of 30
    pro_p = Process(target=producer, args=(q,))
    con_p = Process(target=consumer, args=(q,))
    pro_p.start()
    con_p.daemon = True  # the consumer ends with the main process once the producer has finished
    con_p.start()
    pro_p.join()
    print('End of main process')