K6 + Grafana for Stress Testing
Hello everyone! After 2 or 3 months without writing anything, I'm back to share something from a recent stress testing task at work. I say "share", but honestly this post is mostly a note to myself before I start forgetting what I've done 😂. If it helps you somehow too, I'll be glad 😊.
Before we start: if you'd rather skip the walkthrough, just jump to the end of the article, where I've put my template.
Okay, let’s get started!
Based on this K6 document, we'll need InfluxDB as our database to store the test results, and we'll need a Grafana server too (of course 😺) to visualize all the output.
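A minimal docker-compose file for this pair could look something like the following (the image tags and the initial database name are just example choices, adjust them for your own setup):

```yaml
version: "3"
services:
  influxdb:
    image: influxdb:1.8          # k6's built-in InfluxDB output targets InfluxDB 1.x
    ports:
      - "8086:8086"
    environment:
      - INFLUXDB_DB=k6           # creates an initial database for k6 to write into
  grafana:
    image: grafana/grafana
    ports:
      - "3000:3000"
    depends_on:
      - influxdb
```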
Start everything up with the docker-compose file above, and InfluxDB and Grafana will be available locally on ports 8086 and 3000. Open Grafana at http://localhost:3000 and set up a connection to a new InfluxDB data source like this:
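(If you prefer configuring this from a file instead of clicking through the UI, Grafana's data source provisioning can do the same job; the file path, data source name and the k6 database below are just examples:)

```yaml
# e.g. grafana/provisioning/datasources/influxdb.yaml
apiVersion: 1
datasources:
  - name: k6-influxdb
    type: influxdb
    access: proxy
    url: http://influxdb:8086   # service name from the compose file above
    database: k6
```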
After that, you can create your own dashboard to view the test output, or just import my template to save time.
Ah yes, one thing not to forget: if there's no data in the database, there's nothing to view 😄. To push K6 results into InfluxDB, we have this simple command: `k6 run -o influxdb=http://localhost:8086/<influxdb name> <script>`
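For example, if the database is named k6 and the script file is test.js (both names here are just placeholders):

```bash
k6 run -o influxdb=http://localhost:8086/k6 test.js
```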
In this article, I use my test script with 2000 VUs, ramping up over 30 seconds, for a total duration of 1 minute. The `errorRate` is a custom metric so we can count failed requests on our dashboard. The server under test in this example is nothing fancy, just Python's built-in HTTP server, which you can start with `python3 -m http.server`.
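For reference, a script with that shape could look roughly like this (the exact stages and the target URL are just an illustration; port 8000 is the default of `python3 -m http.server`):

```javascript
import http from 'k6/http';
import { check } from 'k6';
import { Rate } from 'k6/metrics';

// Custom metric so failed requests get their own series on the dashboard.
export const errorRate = new Rate('errorRate');

export const options = {
  stages: [
    { duration: '30s', target: 2000 }, // ramp up to 2000 VUs in 30 seconds
    { duration: '30s', target: 2000 }, // hold them for the rest of the 1-minute run
  ],
};

export default function () {
  const res = http.get('http://localhost:8000/'); // python3 -m http.server default port
  errorRate.add(!check(res, { 'status is 200': (r) => r.status === 200 }));
}
```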
After finishing all the setup, here's a sample result dashboard:
*Some More Notes*
As we know, K6 can create a huge number of virtual users (about 30k - 40k), but some people will still need more than that. So this note is for those who do.
Though K6 doesn't support clustering, we can still use K6 along with k8s to solve this problem: build an image that runs the test, then deploy it to k8s with multiple instances (a rough sketch follows below). That would be the way 😋. Read these if you want more detail:
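To make the idea a bit more concrete, one way to run several k6 instances at once is a Kubernetes Job like the sketch below (the image name, parallelism, script path and database name are all assumptions, not a ready-to-use manifest):

```yaml
apiVersion: batch/v1
kind: Job
metadata:
  name: k6-stress
spec:
  parallelism: 4          # how many k6 instances run the same script at once
  completions: 4
  template:
    spec:
      restartPolicy: Never
      containers:
        - name: k6
          image: my-registry/k6-test:latest   # an image with k6 and the test script baked in
          command:
            - k6
            - run
            - -o
            - influxdb=http://influxdb:8086/k6   # assumes InfluxDB is reachable inside the cluster
            - /scripts/test.js
```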
Ah, and this is my source on GitHub. Thank you for spending your time on my topic!!!