June 25, 2021
Grand Central Dispatch tutorial

Released by Apple in 2009, Grand Central Dispatch was built to manage tasks that run concurrently, queue them for execution, and schedule them to execute on unoccupied processors in the background.

GCD's implementation is available as an open source library, commonly referred to as libdispatch, released under the Apache License. This library executes heavy operations in the background, keeping the main thread running smoothly and providing faster response times.

GCD represents each task as a block or function that takes no arguments.
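For instance, a minimal sketch of defining such a task and handing it to a queue might look like this (the closure here is purely illustrative):

// A GCD task is simply a closure (or function) with no parameters
let task: () -> Void = {
    print("Running a task")
}

// Hand the closure to a queue for execution
DispatchQueue.global().async {
    task()
}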

We’ll discuss how GCD provides multithreading and queues as well as explore what issues GCD solves with its techniques and features.

Threads, multithreading, and queues in Grand Central Dispatch

To really understand GCD, we’ll review threads, multithreading, and queues.

Threads

GCD works with two kinds of threads: the main thread and background threads; all tasks execute on one of them. It's important to keep the main thread as free as possible so our UI stays fast and responsive. Any heavy task should be pushed to a background thread.
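As a quick sketch, we can verify which thread a piece of code is running on and hop back to the main thread for UI-related work:

// Heavy work runs on a background (global) queue
DispatchQueue.global().async {
    print("On main thread? \(Thread.isMainThread)")     // false

    // UI updates must return to the main thread
    DispatchQueue.main.async {
        print("On main thread? \(Thread.isMainThread)") // true
    }
}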

Multithreading

Because a CPU core executes only one task at a time, multithreading makes the system switch rapidly between tasks, allowing it to appear to execute multiple tasks at the same time.

Multithreading increases responsiveness and decreases lag when performing multiple tasks, ensuring the main thread isn’t interrupted.

Queue

A queue resembles a bus line: imagine people at a bus station waiting to board a bus. The first person in line boards the bus and leaves the line, which in programming is called first in, first out (FIFO).

In GCD, a queue is a group of code blocks or functions waiting to be executed on a thread.

There are two types of queues:

A serial queue, which executes tasks one at a time, from first to last
A concurrent queue, which runs tasks simultaneously; tasks finish in an order that depends on how long each operation takes

In relation to threads, concurrent queues do their work on background threads, while the main queue is a serial queue bound to the main thread. An app uses the main queue to execute tasks serially and dispatches the heavy lifting to concurrent queues. This is where Grand Central Dispatch comes in.
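As a rough sketch (the queue labels below are made up for illustration), the difference between the two looks like this:

let serialQueue = DispatchQueue(label: "com.example.serial")
let concurrentQueue = DispatchQueue(label: "com.example.concurrent", attributes: .concurrent)

// Serial: tasks run one at a time, in the order they were added
for i in 1...3 {
    serialQueue.async { print("Serial task \(i)") }          // always 1, 2, 3
}

// Concurrent: tasks start in order but can finish in any order
for i in 1...3 {
    concurrentQueue.async { print("Concurrent task \(i)") }  // order not guaranteed
}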

What does Grand Central Dispatch solve?

Without the ability to perform heavy tasks on the background thread, tasks would be done serially, slowing down performance.

But by creating queues and placing blocks of code into them, GCD hands the long, heavy tasks to background threads for execution. Each block of code awaiting execution in a queue is held in a closure.
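A common pattern, sketched here with a stand-in computation, is to push the heavy work onto a background queue and hop back to the main queue for the UI update:

// Heavy work is dispatched to a background (global) queue
DispatchQueue.global(qos: .userInitiated).async {
    let sum = (1...1_000_000).reduce(0, +)   // stand-in for an expensive task

    // Any UI update goes back to the main queue
    DispatchQueue.main.async {
        print("Finished with result \(sum)")
    }
}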

GCD techniques

The techniques GCD uses provide comprehensive support for executing concurrent code.

We’ll review some of the GCD techniques that are useful for iOS developers, including:

How to schedule and manage tasks with DispatchQueue

Grouping and synchronizing tasks as a single unit with DispatchGroup

DispatchQueue

DispatchQueue is where GCD creates tasks by packaging them into a block or function and placing them into a queue in the order they are submitted. This keeps our queues of tasks in order and helps execute them serially or concurrently.

There are three types of DispatchQueue:

The main queue is serial, runs on the main thread, and is used for UI-related operations
Global queues are concurrent queues and execute tasks in order of priority
Custom queues are customized serial and concurrent queues
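The main and global queues are provided for us and can be referenced directly; a quick sketch (the QoS value below is just an example):

// Main queue: serial, tied to the main thread
let mainQueue = DispatchQueue.main

// Global queue: concurrent, selected by quality of service (priority)
let globalQueue = DispatchQueue.global(qos: .utility)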

A custom DispatchQueue can be created with the following:

let dispatchQueue = DispatchQueue(label: "myqueue")

We can then make tasks execute synchronously or asynchronously by adding the code below:

let dispatchQueue = DispatchQueue(label: "myqueue")

// Code executes synchronously
dispatchQueue.sync {
    print("Do something synchronous")
}

// Code executes asynchronously
dispatchQueue.async {
    print("Do something asynchronous")
}

If we want to execute our tasks synchronously only, we add the following:

let dispatchQueue = DispatchQueue(label: "myqueue")

// Task is synchronous
dispatchQueue.sync {
    // Delay this task for 5 seconds
    Thread.sleep(forTimeInterval: 5)
    print("Do something synchronous")
}

print("Do something")

// Output:
// Do something synchronous
// Do something

Notice that print("Do something") must wait for the first task to complete. In this instance, the first task sleeps for five seconds, executes, and only then does the program move on to the next line of code.

If we need our tasks to run asynchronously, DispatchQueue can still help us. When running tasks asynchronously, they complete on their own time while the main thread still executes synchronously:

let dispatchQueue = DispatchQueue(label: "myqueue")

dispatchQueue.async {
    Thread.sleep(forTimeInterval: 1)
    print("Do something asynchronous")
}

print("Do something")

// Output:
// Do something
// Do something asynchronous

Because we ran dispatchQueue.async for "Do something asynchronous", "Do something" prints first.

DispatchGroup

The main purpose of a DispatchGroup is to wait for a set of tasks to finish. As individual tasks complete, the group waits until all of them are done before moving on to the next operation. So, when we have a group of concurrent tasks, a DispatchGroup notifies us once they have all completed.

When we create a DispatchGroup, we can create custom concurrent and serial queues whose asynchronous tasks are linked to the same group.

If we want to be notified when the tasks in the group are complete, we can use group.notify, which comes from the DispatchGroup we declared earlier:

let group = DispatchGroup()

// Concurrent queue
let queue = DispatchQueue(label: "com.logrocket.concurrentqueue", attributes: .concurrent)

// Link the queue to the group
queue.async(group: group) {
    Thread.sleep(forTimeInterval: 3)
    print("My first task")
}

// Serial queue (queues are serial by default, so no attribute is needed)
let queue2 = DispatchQueue(label: "com.logrocket.serialqueue")

// Link the queue to the group
queue2.async(group: group) {
    print("My second task")
}

// Notify us on the main thread once every task in the group has completed
group.notify(queue: DispatchQueue.main) {
    Thread.sleep(forTimeInterval: 1)
    print("All concurrent tasks done")
}

print("Waiting for tasks")

/* Output:
Waiting for tasks
My second task
My first task
All concurrent tasks done
*/

This works similarly to semaphores, but in our case, if our tasks aren't touching a shared resource, a DispatchGroup is the better fit.
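For contrast, a semaphore controls access to a shared resource; here is a rough sketch (the counter is purely illustrative):

let semaphore = DispatchSemaphore(value: 1)
let workerQueue = DispatchQueue.global()
var sharedCounter = 0

for _ in 1...3 {
    workerQueue.async {
        semaphore.wait()       // request access to the shared resource
        sharedCounter += 1     // critical section
        print("Counter is now \(sharedCounter)")
        semaphore.signal()     // release access
    }
}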

Conclusion

With GCD, we can move the responsibility of managing threads from the main application to the operating system. Achieving a concurrent execution pattern is no longer something a developer must be responsible for.

