
Concurrency is neither good nor bad

Yoga Bagasakthi

As developers, we want to write good code to build a top application. There are several criteria an application must fulfill to be considered top-tier, and one of them is how it handles concurrent work. Concurrency is an interesting word that can mean different things to different people, but in my opinion concurrency is about structuring a program so that multiple tasks make progress during overlapping time periods, which is not quite the same as literally running them at the same instant (that is parallelism). Let's take a look at the illustration below.

[Illustration: concurrent vs. parallel execution]

If you look at the illustration above, you can see how concurrency works. A real-life example of concurrency and parallelism is a chef cooking in a kitchen. A chef has to juggle multiple tasks while cooking. For instance, while they are frying a steak, they may also be chopping vegetables, making a sauce, or cooking pasta.

In this case, the chef is using concurrency. They are not doing one task at a time, but they are managing multiple tasks concurrently. The chef has to be aware of the status of each task, and switch between them as needed to ensure everything is done on time.

Now, let’s add parallelism to the mix. Imagine the chef has a sous chef who can work alongside them, and they can divide the tasks between them. For instance, the chef can be cooking the steak while the sous chef is making the sauce.

This is parallelism. The chef and sous chef are working on separate tasks simultaneously, making the overall process more efficient. They can complete more tasks in less time, and the quality of the food may even improve because they are not as rushed.

Overall, this real-life example shows how concurrency and parallelism can work together to make a process more efficient and effective.

Now, let's jump into a code example:

package main

import (
	"fmt"
	"time"
)

func task1() {
	time.Sleep(2 * time.Second) // simulate a slow task
	fmt.Println("Task 1 Done")
}

func task2() {
	time.Sleep(1 * time.Second) // simulate a faster task
	fmt.Println("Task 2 Done")
}

func main() {
	go task1()
	go task2()

	// Naive wait: give both goroutines time to finish before main exits.
	time.Sleep(3 * time.Second)
	fmt.Println("All tasks done!")
}

In this example, two functions (task1 and task2) simulate long-running work using time.Sleep. We run them concurrently with the go keyword, and finally we wait in main, again with time.Sleep, for both to complete. Note that sleeping for a fixed duration is only good enough for a demo: if either task took longer than three seconds, main would exit before it finished.

Using concurrency brings several benefits, such as:

  1. Improved Performance: Concurrency allows multiple tasks to be executed simultaneously, which can lead to faster execution and improved performance of the system.
  2. Better Resource Utilization: By allowing multiple tasks to run concurrently, the system can better utilize available resources such as CPU, memory, and I/O devices.
  3. Scalability: With concurrency, it is easier to scale up the system as the workload increases by adding more resources such as CPUs, threads, or processes.
  4. Responsiveness: Concurrency can improve the responsiveness of the system by allowing tasks to run in the background without blocking the main thread of execution.
  5. Simplicity: With the use of channels and other abstractions, concurrency can make it easier to write simple and maintainable code.

But there are also some trade-offs that you have to reckon with, such as:

  1. Complexity: Concurrency adds complexity to the code and can make it harder to reason about the behavior of the system. It requires careful coordination between different tasks to avoid issues like race conditions and deadlocks.
  2. Debugging: When things go wrong with concurrent programs, debugging can be difficult. It can be challenging to reproduce and isolate the problem and figure out the cause of the issue.
  3. Overhead: Concurrency comes with some overhead in terms of memory usage and scheduling overhead. When creating too many threads, the system may become less efficient due to the overhead of managing them.
  4. Timing Issues: With concurrent systems, timing can be a critical issue. For example, if one task depends on the result of another task, the timing of the two tasks can affect the correctness of the system.
  5. Synchronization: Synchronization is required to ensure that different tasks do not interfere with each other. This synchronization can be complex and can introduce additional overhead and potential issues like deadlocks.

So, the conclusion of this article is that concurrency must be implemented the correct way and in the right cases, or you are going to face the bad side of concurrency: deadlock, livelock, and starvation. Sounds horrible and strange, right?
