django-autoslug field causes deadlocks OR does not work for unique fields
First, thank you for a great and very useful project.
I have a project where I use unique slug fields.
The problem is that when I import a lot of records using multiple processes (2 per CPU = 16 running processes importing data), I get exceptions about uniqueness violations on the slug field.
Why? Well, the answer is simple: between checking the field for uniqueness in pre_save, passing that test, and actually saving the record, another record with the same slug value can be saved first, and we end up with a duplicate value in a unique field. It's a classic check-then-act race.
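To illustrate, here is a minimal sketch of the retry-on-conflict alternative: instead of checking for uniqueness before saving (which is racy), catch the constraint violation at save time and retry with a new suffix. The database's unique index is simulated here by a plain Python set, and `IntegrityError`, `save`, and `save_with_retry` are hypothetical stand-ins, not django-autoslug API:

```python
class IntegrityError(Exception):
    """Stand-in for django.db.utils.IntegrityError (no real DB here)."""

# Simulated unique index: the set of slugs already committed.
committed_slugs = {"my-title", "my-title-2"}

def save(slug):
    """Simulated INSERT that enforces the unique constraint at commit time."""
    if slug in committed_slugs:
        raise IntegrityError(f"duplicate slug: {slug}")
    committed_slugs.add(slug)

def save_with_retry(base_slug, max_attempts=10):
    """Try base_slug, then base_slug-2, base_slug-3, ... until the
    (simulated) unique constraint accepts one. Because the conflict is
    caught rather than predicted, concurrent writers cannot race past it."""
    for n in range(1, max_attempts + 1):
        candidate = base_slug if n == 1 else f"{base_slug}-{n}"
        try:
            save(candidate)
            return candidate
        except IntegrityError:
            continue
    raise RuntimeError("could not find a free slug")

print(save_with_retry("my-title"))  # -> "my-title-3"
```

The point is that only the database's own constraint check is atomic, so any fix has to treat the `IntegrityError` as the signal, not the pre-save lookup.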
Running every process in a single transaction is also not an option, as it causes deadlocks in PostgreSQL, but that's another story.
Do you have any ideas on how this could be fixed?
Wrapping obj.save() in a transaction does not seem to help much either.
What comes to mind at the moment is that I could drop the uniqueness constraint, import all the data, then check for duplicate (now non-unique) slug values and update them, and only THEN re-enable uniqueness on the slug field. But that does not seem like an optimal solution.
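The post-import cleanup step could be sketched roughly like this (a toy version operating on plain dicts; `deduplicate_slugs` is a hypothetical helper, and in practice this would run as a query over the real table before re-creating the unique index):

```python
def deduplicate_slugs(records):
    """Given records imported without the unique constraint, rename
    duplicate slugs in place (keeping the first occurrence unchanged)
    so the unique index can be re-created afterwards."""
    seen = set()
    for rec in records:
        slug = rec["slug"]
        if slug not in seen:
            seen.add(slug)
            continue
        # Duplicate: find the lowest free numeric suffix.
        n = 2
        while f"{slug}-{n}" in seen:
            n += 1
        rec["slug"] = f"{slug}-{n}"
        seen.add(rec["slug"])
    return records

rows = [{"slug": "a"}, {"slug": "a"}, {"slug": "b"}]
print([r["slug"] for r in deduplicate_slugs(rows)])  # -> ['a', 'a-2', 'b']
```

This works, but as noted it means the table briefly has no uniqueness guarantee, which is why it feels like a workaround rather than a fix.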
Beware -- this situation could also appear in a production setting. Unlikely, but it can happen.