I've noticed your django-celery-email project and wanted to share some experience with this approach.
Several months ago I implemented almost the same app for a project I'm working on, and it was a failure.
It works fine for small email volumes (100-200 emails at once), but then we sent about 5k emails at once. I thought a message queue system would help, but it made things worse instead.
Because messages were passed as task parameters, they were stored in RabbitMQ queues, which caused extreme memory consumption. Server memory was quickly exhausted, RabbitMQ crashed, the server hung, emails weren't delivered, and even worse, we didn't know which emails had been delivered and which hadn't. Maybe the rate_limit attribute of my task was the source of the problem; I don't know.
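To put rough numbers on the memory issue: when the full message travels as a task argument, the broker holds the entire serialized body for every queued task. This is just an illustrative sketch (the dict fields here are made up, not the exact payload django-celery-email produces):

```python
import json

# Hypothetical rendered email, standing in for a Django EmailMessage.
# When the whole message is a task parameter, something like this entire
# dict sits serialized in the RabbitMQ queue for every pending task.
message = {
    "subject": "Monthly newsletter",
    "body": "Hello!\n" + "Lorem ipsum dolor sit amet. " * 200,  # a few KB of text
    "from_email": "news@example.com",
    "to": ["user@example.com"],
}

full_payload = json.dumps(message)           # what the broker stores per task
id_payload = json.dumps({"message_id": 42})  # what it could store instead

print(len(full_payload), len(id_payload))
```

At ~5k queued tasks that is roughly `5000 * len(full_payload)` bytes held in broker memory, versus a few hundred KB if only ids were queued; with large HTML bodies the gap grows accordingly.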
Later I discovered that dedicated MTAs such as exim are good at sending email: when you send an email via exim, it returns immediately and puts the message into its efficient internal queue, so there is no need to use celery for sending emails if a local MTA is used.
The end of the story: we switched to the MailChimp API and no longer send high email volumes ourselves. Registration emails etc. are now sent through the standard local exim without any trouble.
If this app is not just an experiment, I'd suggest testing it on relatively high email volumes before putting it in production. Maybe storing messages in the DB and passing message ids as task parameters instead of full messages would help.
It is possible that the described fault was my own (I don't have much experience with RabbitMQ). I'm filing this as an issue for your app because my app was almost the same.
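A minimal sketch of that store-then-pass-id idea, with an in-memory sqlite table standing in for a Django model and a plain function standing in for the celery task (the names `outbox`, `enqueue_email`, and `send_email_task` are all made up for illustration):

```python
import sqlite3

# In-memory table standing in for a Django "outbox" model.
db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE outbox ("
    "id INTEGER PRIMARY KEY, recipient TEXT, body TEXT, sent INTEGER DEFAULT 0)"
)

def enqueue_email(recipient, body):
    """Persist the full message; only the small integer id would go to the broker."""
    cur = db.execute(
        "INSERT INTO outbox (recipient, body) VALUES (?, ?)", (recipient, body)
    )
    message_id = cur.lastrowid
    # In a real setup this would be: send_email_task.delay(message_id)
    return message_id

def send_email_task(message_id):
    """Worker side: load the message by id, deliver it, record the outcome."""
    recipient, body = db.execute(
        "SELECT recipient, body FROM outbox WHERE id = ?", (message_id,)
    ).fetchone()
    # ... actual delivery via SMTP / local exim would happen here ...
    db.execute("UPDATE outbox SET sent = 1 WHERE id = ?", (message_id,))

mid = enqueue_email("user@example.com", "Hello!")
send_email_task(mid)
print(db.execute("SELECT sent FROM outbox WHERE id = ?", (mid,)).fetchone()[0])
```

Besides keeping the broker queues tiny (bare integers instead of full bodies), the `sent` flag means that after a crash you can query the DB to see exactly which emails went out, which was the part we couldn't answer when everything lived only in RabbitMQ.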