http://stackoverflow.com/questions/7718801/is-a-gcd-dispatch-queue-enough-to-confine-a-core-data-context-to-a-single-thread?lq=1
Concern
One big advantage of GCD dispatch queues is that GCD manages and makes use of multiple threads to process a queue's FIFO tasks as needed. Each task gets executed by one thread. However, it may be a different thread that processes the next task.
So, if I understand this right, tasks I hand off to one and the same dispatch queue could end up running on different threads, potentially handing a Core Data context from one thread to another and having things go wrong. Is that right?
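For instance, a quick sketch like the following (the queue label and the loop are just illustrative) shows that a serial queue makes no promise about which thread runs each block:

```swift
import Foundation

// A serial queue: blocks run one at a time, in FIFO order, but GCD is free
// to run each block on whichever underlying thread it chooses.
let queue = DispatchQueue(label: "com.example.serial")   // label is illustrative

for i in 0..<5 {
    queue.async {
        // The blocks never overlap in time, but Thread.current may differ
        // from one block to the next.
        print("task \(i) on thread \(Thread.current)")
    }
}

// Wait for the queued blocks to finish before the program exits.
queue.sync { print("done") }
```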
Answer
The accepted answer to that question, which uses GCD queues, does ensure that a new context is created on each thread, but it does not explain why that is necessary.
The big thing you need to remember is that you must avoid modifying the managed object context from two different threads at the same time.
That could put the context into an inconsistent state, and nothing good can come of that. So, the kind of dispatch queue that you use is important: a concurrent dispatch queue would allow multiple tasks to proceed simultaneously, and if they both use the same context you’ll be in trouble.
If you use a serial dispatch queue, on the other hand, two or more tasks might execute on different threads, but the tasks will be executed in order, and only one task will run at a time. This is very similar to running all the tasks on the same thread, at least as far as maintaining the context’s consistency goes.
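A minimal sketch of that pattern, assuming a wrapper type and names I've made up for illustration (ContextConfiner, the queue label, and the injected context are not from the original answer):

```swift
import CoreData
import Foundation

// Every interaction with `context` is funneled through one serial queue, so at
// most one task can touch the context at any moment, even though GCD may run
// successive tasks on different threads.
final class ContextConfiner {
    private let queue = DispatchQueue(label: "com.example.coredata")  // serial by default
    private let context: NSManagedObjectContext

    init(context: NSManagedObjectContext) {
        self.context = context
    }

    // Enqueue work against the context; blocks execute one at a time, in order.
    func perform(_ work: @escaping (NSManagedObjectContext) -> Void) {
        let context = self.context
        queue.async {
            work(context)
        }
    }
}
```

Every fetch, insert, and save then goes through perform(_:), which is what keeps the context from being used from two threads at once.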
See this question and answer for a much more detailed explanation.
This is how Core Data has always worked. The Concurrency with Core Data section of the Core Data Programming Guide gives advice on how to proceed if you do decide to use a single context in multiple threads. It talks mainly about the need to be very careful to lock the context any time you access it. The point of all that locking, though, is to ensure that two or more threads don’t try to use the context simultaneously.
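The locking pattern the guide describes amounts to something like the sketch below. This uses a plain NSLock purely to illustrate the mutual exclusion involved; the names are hypothetical and this is not the exact API the guide uses.

```swift
import CoreData
import Foundation

// One shared context used from several threads, with a lock guaranteeing that
// only one thread uses it at a time. `contextLock` and `withLockedContext`
// are illustrative names, not Core Data API.
let contextLock = NSLock()

func withLockedContext(_ sharedContext: NSManagedObjectContext,
                       _ work: (NSManagedObjectContext) -> Void) {
    contextLock.lock()
    defer { contextLock.unlock() }
    work(sharedContext)   // no other thread can enter this block concurrently
}
```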
Using a serial dispatch queue achieves the same goal: because only one task in the queue executes at a time, there's no chance that two or more tasks will try to use the context at the same time.
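Putting it together with the sketch above (the "Note" entity, backgroundContext, and the worker are all hypothetical):

```swift
// Both calls can come from any thread; the serial queue guarantees the
// closures run one after the other, never concurrently.
let worker = ContextConfiner(context: backgroundContext)  // backgroundContext is assumed to exist

worker.perform { context in
    let note = NSEntityDescription.insertNewObject(forEntityName: "Note", into: context)
    note.setValue("first", forKey: "title")
}

worker.perform { context in
    try? context.save()
}
```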