Concurrency vs Parallelism

Video by: Defog Tech
Clear the confusion about parallelism and concurrency, and what tools Java provides to enable each concept.

Channel
----------------------------------
Complex concepts explained in a short & simple manner. Topics include Java Concurrency, Spring Boot, Microservices, Distributed Systems etc. Feel free to ask any doubts in the comments. Also happy to take requests for new videos. Subscribe or explore the channel - https://youtube.com/defogtech

Popular Videos
----------------------------------
What is an API Gateway - https://youtu.be/vHQqQBYJtLI
Executor Service - https://youtu.be/6Oo-9Can3H8
Introduction to CompletableFuture - https://youtu.be/ImtZgX1nmr8
Java Memory Model in 10 minutes - https://youtu.be/Z4hMFBvCDV4
Volatile vs Atomic - https://youtu.be/WH5UvQJizH0
What is Spring Webflux - https://youtu.be/M3jNn3HMeWg
Java Concurrency Interview question - https://youtu.be/_RSAS-gIjGo

Video Transcript

0:00
Let's start with parallelism
0:03
Let's take a look at this code
0:05
Here we have a main function, and in the main function we are starting two threads. In
0:11
the first thread we are calling this function called processTask for user 1, and
0:17
in the second thread we are calling processTask for user 2.
0:22
Once we start these two threads, we are calling this function called heavy
0:26
calculations, which is essentially task 3.
0:31
So we have 3 tasks, task 1 and task 2 are run on separate threads and task 3 will be
0:38
run by the main thread.
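The code shown in the video is not reproduced on this page. Here is a minimal sketch of what the narration describes, assuming the method names processTask and heavyCalculations and placeholder bodies, since the real work is not shown:

```java
import java.util.concurrent.atomic.AtomicInteger;

public class ParallelismDemo {
    // Placeholder for the real work: each "task" just records that it ran
    static AtomicInteger tasksCompleted = new AtomicInteger();

    static void processTask(int userId) { tasksCompleted.incrementAndGet(); }
    static void heavyCalculations() { tasksCompleted.incrementAndGet(); }

    public static void main(String[] args) throws InterruptedException {
        Thread thread1 = new Thread(new Runnable() {
            @Override public void run() { processTask(1); } // Task 1
        });
        Thread thread2 = new Thread(new Runnable() {
            @Override public void run() { processTask(2); } // Task 2
        });
        thread1.start();
        thread2.start();

        heavyCalculations(); // Task 3 runs on the main thread

        thread1.join();
        thread2.join();
        System.out.println("Tasks completed: " + tasksCompleted.get());
    }
}
```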
0:40
If we use Java 8, we can reduce this code to look something like this.
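The reduced Java 8 version is likewise not shown on the page; a sketch, again assuming the method names from the narration, where lambdas replace the anonymous Runnable boilerplate:

```java
import java.util.concurrent.atomic.AtomicInteger;

public class ParallelismJava8 {
    // Placeholder task bodies, as the real work is not shown in the excerpt
    static AtomicInteger tasksCompleted = new AtomicInteger();

    static void processTask(int userId) { tasksCompleted.incrementAndGet(); }
    static void heavyCalculations() { tasksCompleted.incrementAndGet(); }

    public static void main(String[] args) throws InterruptedException {
        // A lambda stands in for each Runnable
        Thread thread1 = new Thread(() -> processTask(1)); // Task 1
        Thread thread2 = new Thread(() -> processTask(2)); // Task 2
        thread1.start();
        thread2.start();

        heavyCalculations(); // Task 3 on the main thread

        thread1.join();
        thread2.join();
        System.out.println("Tasks completed: " + tasksCompleted.get());
    }
}
```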
0:46
If we run this program on a quad core CPU, which means that there are four cores in our processor,
0:54
then we can run these three tasks in parallel. In the OS, there is a component called the scheduler,
1:02
which is responsible for scheduling the threads onto the CPU cores. So in this case, it is possible that
1:09
the main thread will run on core 1 and thread 1 and thread 2 will run on core 3 and core
1:16
2 respectively.
1:18
Since none of the tasks are dependent on each other, all three tasks will run on three separate
1:24
cores in parallel.
1:27
So parallelism is about doing a lot of things at once so that we can speed up our program.
1:34
We can very well write the same code we saw, but here, instead of creating our own
1:42
threads, we can create a thread pool. We are creating a fixed thread pool of size
1:48
4 here, and we are submitting two tasks.
1:52
So even in this case it is possible that on the CPU three cores are used to run these
1:59
three tasks simultaneously.
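A sketch of the thread-pool variant described here, using the same assumed method names and placeholder bodies:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class ThreadPoolDemo {
    // Placeholder task bodies, as the real work is not shown in the excerpt
    static AtomicInteger tasksCompleted = new AtomicInteger();

    static void processTask(int userId) { tasksCompleted.incrementAndGet(); }
    static void heavyCalculations() { tasksCompleted.incrementAndGet(); }

    public static void main(String[] args) throws InterruptedException {
        // Fixed pool of 4 worker threads, as in the narration
        ExecutorService pool = Executors.newFixedThreadPool(4);
        pool.submit(() -> processTask(1)); // Task 1
        pool.submit(() -> processTask(2)); // Task 2

        heavyCalculations(); // Task 3 still runs on the main thread

        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
        System.out.println("Tasks completed: " + tasksCompleted.get());
    }
}
```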
2:02
So in Java,
2:04
to enable parallelism, we can either use raw threads like we did earlier, or we can use
2:12
this concept of thread pool.
2:15
Even in thread pool, it could be an executor service, it could be the newer fork join pool
2:21
or it could be custom thread pools which are used by web servers.
2:25
But in all the cases, we'll have to ensure that our CPU has more than one core so that
2:32
we can run multiple threads in parallel to speed up our program.
2:39
So that was parallelism, relatively straightforward.
2:43
Now let's look at concurrency.
2:45
For concurrency, let's look at this code.
2:49
Here we have a main function and we have two threads again, thread 1 and thread 2.
2:55
In thread 1 we are checking if the tickets available to book are more than 0.
3:02
If they are more than 0 then book the ticket and decrement available tickets count.
3:08
We are starting that thread and in thread 2 we are doing the exact same thing.
3:13
So the idea is we can book the ticket only if there are available tickets and once we
3:19
book the ticket we will reduce the available tickets by 1.
3:23
After starting the two threads, we'll just put the main thread to sleep for a few seconds.
3:27
So here essentially we have two tasks and both the tasks are accessing this shared variable.
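The booking code itself is not shown on the page; a minimal sketch of the check-then-act pattern described above, with an assumed shared counter availableTickets and deliberately no synchronization, to illustrate the race:

```java
public class TicketBookingDemo {
    // Shared mutable state accessed by both threads (unsynchronized on purpose,
    // to illustrate the check-then-act race described in the narration)
    static int availableTickets = 1;

    static void bookTicket(String who) {
        if (availableTickets > 0) {   // check: is a ticket available?
            availableTickets--;       // act: book it and decrement the count
            System.out.println(who + " booked a ticket");
        }
    }

    public static void main(String[] args) throws InterruptedException {
        Thread thread1 = new Thread(() -> bookTicket("thread-1"));
        Thread thread2 = new Thread(() -> bookTicket("thread-2"));
        thread1.start();
        thread2.start();
        // The main thread sleeps so the threads can finish
        // (shortened here; the video sleeps for a few seconds)
        Thread.sleep(500);
    }
}
```

Because the check and the decrement are separate steps, both threads can pass the check before either decrements, so the count can end at -1 instead of 0; that interleaving is exactly the hazard the rest of the transcript explains.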
3:35
This time let's assume we have only one core on our CPU.
3:39
Even in this case, our scheduler, which is responsible for scheduling the threads onto
3:45
the CPU, has the same job.
3:48
But it's trickier now because there is only one core.
3:52
So, the scheduler will have to do some time sharing between the threads and to be fair
3:58
with the threads, it is possible that the scheduler will schedule thread 1 for a few milliseconds
4:06
and then kick thread 1 out and put thread 2 in for a few milliseconds.
4:12
Once those few milliseconds pass, it will again schedule thread 1 to execute, and so
4:18
on and so forth.