Multithreading - can using multiple locks improve the efficiency of Python threads?
阿神 2017-04-17 17:36:49

[Edited: sorry, there were two points I originally failed to make clear, so I have revised the sample code:
1. I use multiple threads because of network IO.
2. All three resources below are shared between threads, which is why they are locked.]
How can I optimize the efficiency of this multithreaded code?

The original multithreaded structure is as follows: create one lock and guard every thread-unsafe function with it.

import threading

class Scanner:

    def __init__(self):
        self.lock = threading.Lock()  # single lock shared by all unsafe functions

    def out_to_file1(self):  # resource 1
        with self.lock:
            f = open(file1)
            f.write()
            f.close()

    def out_to_file2(self):  # resource 2
        with self.lock:
            f = open(file2)
            f.write()
            f.close()

    def update_scan_count(self):  # resource 3
        with self.lock:
            self.scan_count += 1

    def _scan(self):  # per-thread logic
        while self.queue.qsize() > 0 and self.is_continue:
            ...
            # network IO happens here (the reason I use threads)
            ...
            self.out_to_file2()
            self.update_scan_count()
            self.out_to_file1()

    def main(self):
        threads = []
        for i in range():  # thread count elided in the original
            t = threading.Thread(target=self._scan, name=str(i))
            t.daemon = True
            t.start()
            threads.append(t)
        for t in threads:  # join after starting all threads; joining inside
            t.join()       # the start loop would run the threads one at a time

Now I am considering creating three locks instead:

lock1 = threading.Lock()
lock2 = threading.Lock()
lock3 = threading.Lock()

and guarding each of the three resources with its own lock.
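A minimal, runnable sketch of the per-resource-lock idea (the class and method names are hypothetical, and only the counter resource is exercised here):

```python
import threading

class Scanner:
    def __init__(self):
        # one independent lock per shared resource, so a thread writing
        # file1 no longer blocks a thread that only updates the counter
        self.lock_file1 = threading.Lock()
        self.lock_file2 = threading.Lock()
        self.lock_count = threading.Lock()
        self.scan_count = 0

    def update_scan_count(self):
        with self.lock_count:  # serializes only the counter updates
            self.scan_count += 1

def worker(scanner, n):
    for _ in range(n):
        scanner.update_scan_count()

s = Scanner()
threads = [threading.Thread(target=worker, args=(s, 1000)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(s.scan_count)  # 4000: no increments lost
```

Splitting the lock reduces contention between unrelated resources, but each individual resource is still serialized, and under the GIL it is the overlap of network IO, not the locking scheme, that delivers most of the benefit of threads.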

Will this improve efficiency? Any other suggestions? Thanks!
PS: I would also appreciate pointers to some well-written open-source Python multithreading projects to read and learn from!


All 5 replies
巴扎黑

It is recommended to use a Queue for coordination. Since Python 3 there is also the packaged concurrent.futures module, which lets you use thread pools and process pools almost like ordinary function calls.
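A sketch of the concurrent.futures suggestion, assuming a hypothetical scan function standing in for the real network IO; the pool replaces the manual Thread/start/join bookkeeping:

```python
from concurrent.futures import ThreadPoolExecutor

def scan(url):
    # placeholder for the real network-IO work
    return len(url)

urls = ["http://a.example", "http://bb.example", "http://ccc.example"]

# map distributes the calls over the pool and returns results in input order
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(scan, urls))

print(results)  # [16, 17, 18]
```

Swapping ThreadPoolExecutor for ProcessPoolExecutor is the one-line change that gives real parallelism for CPU-bound work, as the answers below suggest.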

Ty80

First of all, Python's multithreading itself is extremely inefficient because of the limitations of the GIL (Global Interpreter Lock) mechanism. Its effect, put simply, is: within one interpreter, only one thread can be executing bytecode at a time.
So if you want to pursue the efficiency of multithreading in the traditional sense, in the Python world you should use multiprocessing instead...

Here you use multithreading and use locks to control shared resources. First of all, locking can introduce deadlock; without locks there is no deadlock risk, but there will be synchronization problems instead.

Also, if different threads operate on different files, there is no synchronization problem at all. If they operate on the same file, I recommend using a Queue to handle it.

Overall, a single thread may well be good enough: Python multithreading is inefficient anyway, and with a single thread there are no synchronization issues to consider. If you must pursue efficiency, use multiple processes, but then process-level locks also have to be considered.
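The Queue idea above can be sketched as follows: many producer threads put lines on a queue and one consumer thread owns the output, so no lock is needed anywhere (a list stands in for the real file):

```python
import queue
import threading

q = queue.Queue()
lines_written = []  # stands in for the shared output file

def writer():
    # single consumer: the only code that ever touches the "file"
    while True:
        item = q.get()
        if item is None:  # sentinel value: shut down
            break
        lines_written.append(item)  # stands in for f.write(item)

w = threading.Thread(target=writer)
w.start()

def producer(tid):
    for i in range(100):
        q.put(f"thread{tid}-{i}")  # Queue.put is thread-safe

producers = [threading.Thread(target=producer, args=(t,)) for t in range(3)]
for p in producers:
    p.start()
for p in producers:
    p.join()

q.put(None)  # tell the writer to stop
w.join()
print(len(lines_written))  # 300
```

Because queue.Queue does its own internal locking, the producers never block each other on file IO, and there is no lock for the application code to deadlock on.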
阿神
  1. Writing to different files does not need a lock.

  2. Writing to the same file can be handled without a lock as well, by handing the writes to a single consumer.

Locks are very inefficient; wherever possible, replace the "lock" with a queue feeding one worker.

迷茫

No. Multithreading plus locking like this tends to reduce performance; only single-threaded asynchronous processing will actually improve it.
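The single-threaded asynchronous approach can be sketched with asyncio: one event loop interleaves many network waits with no threads and no locks (asyncio.sleep stands in for a real request, and the fetch coroutine is hypothetical):

```python
import asyncio

async def fetch(i):
    # stands in for a real network request; while one fetch is
    # waiting, the event loop runs the others
    await asyncio.sleep(0.01)
    return i * 2

async def main():
    # gather runs all five coroutines concurrently on one thread
    return await asyncio.gather(*(fetch(i) for i in range(5)))

print(asyncio.run(main()))  # [0, 2, 4, 6, 8]
```

Since everything runs on one thread, shared state like the scan counter needs no lock at all in this model.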

伊谢尔伦

This pattern can easily produce the hold-and-wait condition for deadlock; combined with the other conditions (mutual exclusion, no preemption, circular wait), an actual deadlock becomes possible. Locks do strengthen data safety, but performance will inevitably decrease.
