PyQS uses some very simple semantics to create and read tasks. This comes from SQS having a very simple API. It uses boto3 under the hood and can be installed in all the usual ways.

## Creating Tasks

Adding a task to a queue is pretty simple:

```python
from pyqs import task

@task(queue='email')
def send_email(subject, message):
    pass
```

You can then queue up the task by calling `send_email.delay(...)`.

NOTE: This assumes that you have your AWS keys in the appropriate environment variables, or are using IAM roles. PyQS doesn't do anything too special to talk to AWS; it only creates the appropriate boto connection.

If you don't pass a queue, PyQS will use the function path as the queue name.

You can also specify the function path if you want to reference a function in a different project:

```python
@task(custom_function_path="_email")  # This references function send_email in foo/bar.py instead of email/tasks.py
def send_email(subject):
    pass
```

## Reading Tasks

For example, suppose the `send_email` function above lives in `email/tasks.py`. Since it is somewhere on the PYTHON_PATH and can be imported, we can just run:

```bash
$ pyqs _email
```

We can do the same if we want to run all tasks with a certain prefix. We can also read from multiple different queues with one call by delimiting with commas:

```bash
$ pyqs send_email,read_email,write_email
```

If you want to run more workers to process tasks, you can up the concurrency. This will spawn additional processes to work through the messages:

```bash
$ pyqs send_email --concurrency 10
```

## Simple Process Worker

To use a simpler version of PyQS that deals with some of the edge cases in the original implementation, pass the `--simple-worker` flag. The Simple Process Worker differs from the original implementation in the following way: it does not use an internal queue, and it removes support for the `--prefetch-multiplier` flag. When the `--simple-worker` flag is passed, the default batchsize is 1 instead of 10. This helps simplify the mental model, as messages are not on both the SQS queue and an internal queue.
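For illustration, here is a minimal, self-contained sketch of how a `@task`-style decorator can turn a function call into a queue message. `FakeQueue`, the JSON message format, and passing a queue *object* are assumptions made for this sketch; PyQS's real decorator takes a queue name and ships messages to SQS via boto3.

```python
import json

class FakeQueue:
    """Stands in for an SQS queue: just collects JSON message bodies."""
    def __init__(self):
        self.messages = []

    def send(self, body):
        self.messages.append(body)

def task(queue):
    """Decorator factory: attaches a .delay() that enqueues the call."""
    def decorator(func):
        def delay(*args, **kwargs):
            # Serialize the call so a worker could later import the function
            # by its dotted path and replay the arguments.
            queue.send(json.dumps({
                "task": f"{func.__module__}.{func.__name__}",
                "args": args,
                "kwargs": kwargs,
            }))
        func.delay = delay
        return func
    return decorator

email_queue = FakeQueue()

@task(queue=email_queue)
def send_email(subject, message):
    pass

# Calling .delay() enqueues a message instead of running the function.
send_email.delay("Hi", message="Hello there")
```

The dotted `module.function` path in the message is what lets a worker process find and invoke the right function later, which is also why the queue name defaults to the function path when none is given.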
It has a stable API and has been deployed in production, but we have not received feedback from a large number of use cases, so it is possible there are unknown bugs.
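To make the simple-worker distinction concrete, here is a rough sketch (illustrative Python, not PyQS's actual code) of the two worker models: the original worker prefetches a batch of messages into an in-process internal queue before handling them, while the simple worker fetches and handles one small batch at a time.

```python
import queue

def original_worker(fetch_batch, handle, batchsize=10):
    """Prefetch a batch of messages into an internal queue, then work."""
    internal = queue.Queue()
    for msg in fetch_batch(batchsize):  # messages now exist in two places:
        internal.put(msg)               # on SQS (invisible) and in-process
    while not internal.empty():
        handle(internal.get())

def simple_worker(fetch_batch, handle, batchsize=1):
    """Fetch and handle messages directly; no internal buffering."""
    for msg in fetch_batch(batchsize):  # default batchsize of 1
        handle(msg)                     # handled immediately
```

With no internal queue and a batchsize of 1, a message is only ever in one place at a time, which is the simpler mental model the flag is after.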