```python
import json

import redis

pool = redis.ConnectionPool(host=host, port=port)
client = redis.StrictRedis(connection_pool=pool)

# Push 10000 * 30 = 300,000 copies of the serialized args onto the list.
for i in range(10000):
    for j in range(30):
        client.lpush(IDLE_TASKS, json.dumps(args))
```
The execution efficiency of this approach is terrible: I have to wait dozens of seconds before the insertion completes.
Is there any more efficient way to handle this?
args is just a small tuple, something like (1, 2, "3").
I have never used the redis library myself, so I can only offer some suggestions based on the code you posted; please don't be too harsh if they miss the mark:
1. I don't know where your `args` comes from, but it doesn't seem to change inside the loop body, so you could move the `json.dumps(args)` call outside the loop and execute it only once.
2. Since you need to generate roughly 300,000 identical items, could you build that data first and then hand it to `client.lpush` in batches? After all, TCP has its own latency overhead. (See the sketch after this list.)
3. You can use the `cProfile` library to find out exactly what is taking so long, or try another library to do the insertion (you'll have to google the details yourself); a small profiling example follows below.
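A rough sketch of what suggestions 1 and 2 might look like, assuming redis-py's `lpush(name, *values)` accepts multiple values per call (so each chunk costs one round trip instead of one per item); `host`, `port`, `args`, and `IDLE_TASKS` are taken from your question, and the chunk size is an arbitrary illustration:

```python
import json

import redis

pool = redis.ConnectionPool(host=host, port=port)
client = redis.StrictRedis(connection_pool=pool)

# Suggestion 1: serialize once, outside the loop.
payload = json.dumps(args)

# Suggestion 2: build the full batch up front, then push it in chunks;
# lpush accepts multiple values, so each call is a single round trip.
values = [payload] * 300_000
CHUNK = 1000  # chunk size chosen arbitrarily for illustration
for start in range(0, len(values), CHUNK):
    client.lpush(IDLE_TASKS, *values[start:start + CHUNK])
```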
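And for suggestion 3, one way to profile the insertion with `cProfile` and `pstats`; the function name `insert_tasks` here is just a placeholder for whatever wraps your loop, not something from your code:

```python
import cProfile
import pstats

# Profile whichever function wraps the insertion loop and dump the stats.
cProfile.run("insert_tasks()", "insert_stats")

# Print the 10 entries with the largest cumulative time.
pstats.Stats("insert_stats").sort_stats("cumulative").print_stats(10)
```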