Troubleshooting Microservice’s OutOfMemoryError: Metaspace

Diagnosing and fixing the root cause

Recently, we confronted an interesting ‘java.lang.OutOfMemoryError: Metaspace’ problem in a microservice application. The application would run smoothly for the first few hours, but would then start throwing ‘java.lang.OutOfMemoryError: Metaspace’. In this post, let me share the steps we pursued to troubleshoot this problem.

Different types of OutOfMemoryError

JVM memory has the following regions:

  • a. Young Generation
  • b. Old Generation
  • c. Metaspace
  • d. Other regions

When you encounter ‘java.lang.OutOfMemoryError: Metaspace’, it indicates that the Metaspace region of JVM memory is getting saturated. Metaspace is the region where the metadata required to execute your application is stored. In a nutshell, it contains the class definitions, method definitions, and other metadata of your application. To learn more about what gets stored in each of the JVM memory regions, you may refer to this video clip.
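To make this concrete, here is a minimal, hypothetical sketch of how Metaspace can get saturated. Every class loader that loads a class defines its own copy of that class in Metaspace, so an application that keeps creating class loaders (and keeps them reachable) will grow Metaspace until it is exhausted. The class name and classpath below are assumptions used purely for illustration.

    import java.io.File;
    import java.net.URL;
    import java.net.URLClassLoader;
    import java.util.ArrayList;
    import java.util.List;

    public class MetaspaceSaturationDemo {

        public static void main(String[] args) throws Exception {
            // Keeping the loaders reachable prevents the classes they defined
            // from ever being unloaded, so Metaspace usage only grows.
            List<ClassLoader> loaders = new ArrayList<>();
            URL[] classpath = { new File("target/classes/").toURI().toURL() }; // assumed path

            while (true) {
                // A fresh loader (parent = null) defines its own copy of the class,
                // consuming additional Metaspace on every iteration.
                URLClassLoader loader = new URLClassLoader(classpath, null);
                loader.loadClass("com.example.SomeClass"); // assumed class name
                loaders.add(loader);
            }
        }
    }

Running a sketch like this with a deliberately small limit, for example -XX:MaxMetaspaceSize=64m, surfaces ‘java.lang.OutOfMemoryError: Metaspace’ within seconds.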

Note: There are 9 different types of java.lang.OutOfMemoryError. You can learn about those OutOfMemoryError types here. ‘java.lang.OutOfMemoryError: Metaspace’ is one of them, though it is not a common type.

Diagnose java.lang.OutOfMemoryError: Metaspace

The best place to start debugging ‘java.lang.OutOfMemoryError: Metaspace’ is the garbage collection log. If you haven’t enabled garbage collection logging for your application, you may consider enabling it by passing the JVM arguments mentioned here. Enabling garbage collection logging doesn’t add noticeable overhead to your application, so it’s recommended to enable it on all production JVM instances. To see the great benefits of garbage collection logs, refer to this post.
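For reference, the JVM arguments typically look like the lines below. The exact flags depend on your JVM version, and the log file path shown here is only an assumption.

Java 8 and earlier:

    -XX:+PrintGCDetails -XX:+PrintGCDateStamps -Xloggc:/opt/app/gc.log

Java 9 and later (unified logging):

    -Xlog:gc*:file=/opt/app/gc.log:time,uptime,level,tags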

We uploaded the garbage collection log of this troubled microservice application to GCeasy, the GC log analysis tool. Here is the GC log analysis report generated by the tool. Below is the heap usage graph reported by the tool.

Fig. 1: Heap usage graph reported by GCeasy

I would like to highlight a few observations from this graph:

a. The red triangles in the graph indicate the occurrences of Full Garbage Collection events. When a Full Garbage Collection event runs, it pauses your entire application and tries to free up memory from all the regions (Young, Old, Metaspace). You can see Full Garbage Collection events running consecutively from 12:30 a.m.

b. Even though the maximum heap memory size is 2.5GB, Full Garbage Collection events were consecutively triggered even when heap ...