Details
- Type: Bug
- Resolution: Done
- Priority: Major
- Fix Version/s: jboss-fuse-6.3
- Component/s: None
Description
- A customer observed slowness with one of the broker containers.
- The customer reported that they are not able to open Hawtio, and the Karaf terminal also crashes or hangs.
- In the top output we can see that a particular process ID is consuming high CPU.
high-cpu_sample2.out

  PID USER  PR NI VIRT    RES    SHR   S %CPU %MEM TIME+    COMMAND
24377 fuse  20  0 12.566g 6.770g 20236 R 99.9 16.4 71:10.86 java

[cpandey@cpandey Downloads]$ printf '%x\n' 24377
5f39
[cpandey@cpandey Downloads]$ less high-cpu-tdump_sample2.out

"qtp9475471-79" #79 prio=5 os_prio=0 tid=0x00007fd9506be930 nid=0x5f39 runnable [0x00007fd91283e000]
   java.lang.Thread.State: RUNNABLE
	at java.util.TreeMap.put(TreeMap.java:552)
	at org.apache.felix.cm.impl.CaseInsensitiveDictionary.<init>(CaseInsensitiveDictionary.java:122)
	at org.apache.felix.cm.impl.ConfigurationImpl.getProperties(ConfigurationImpl.java:326)
	at org.apache.felix.cm.impl.ConfigurationAdapter.getProperties(ConfigurationAdapter.java:150)
	at org.apache.karaf.management.internal.BulkRequestContext.getConfiguration(BulkRequestContext.java:119)
	at org.apache.karaf.management.KarafMBeanServerGuard.getRequiredRoles(KarafMBeanServerGuard.java:369)
	at org.apache.karaf.management.KarafMBeanServerGuard.getRequiredRoles(KarafMBeanServerGuard.java:358)
	at org.apache.karaf.management.KarafMBeanServerGuard.canInvoke(KarafMBeanServerGuard.java:251)
	at org.apache.karaf.management.KarafMBeanServerGuard.canInvoke(KarafMBeanServerGuard.java:241)
	at org.apache.karaf.management.internal.JMXSecurityMBeanImpl.canInvoke(JMXSecurityMBeanImpl.java:82)
	at org.apache.karaf.management.internal.JMXSecurityMBeanImpl.canInvoke(JMXSecurityMBeanImpl.java:59)
	at sun.reflect.GeneratedMethodAccessor152.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:71)
	at sun.reflect.GeneratedMethodAccessor68.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:275)
	at com.sun.jmx.mbeanserver.StandardMBeanIntrospector.invokeM2(StandardMBeanIntrospector.java:112)
	at com.sun.jmx.mbeanserver.StandardMBeanIntrospector.invokeM2(StandardMBeanIntrospector.java:46)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.invoke(PerInterface.java:138)
	at com.sun.jmx.mbeanserver.MBeanSupport.invoke(MBeanSupport.java:252)
	at javax.management.StandardMBean.invoke(StandardMBean.java:405)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.invoke(DefaultMBeanServerInterceptor.java:819)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.invoke(JmxMBeanServer.java:801)
	at io.fabric8.jolokia.RBACRestrictor.canInvoke(RBACRestrictor.java:147)
	at io.fabric8.jolokia.RBACRestrictor.isAttributeReadAllowed(RBACRestrictor.java:183)
	at org.jolokia.handler.ReadHandler.checkRestriction(ReadHandler.java:252)
	at org.jolokia.handler.ReadHandler.fetchAttributes(ReadHandler.java:188)
	at org.jolokia.handler.ReadHandler.fetchAttributesForMBeanPattern(ReadHandler.java:132)
	at org.jolokia.handler.ReadHandler.doHandleRequest(ReadHandler.java:116)
	at org.jolokia.handler.ReadHandler.doHandleRequest(ReadHandler.java:37)
	at org.jolokia.handler.JsonRequestHandler.handleRequest(JsonRequestHandler.java:161)
	at org.jolokia.backend.MBeanServerHandler.dispatchRequest(MBeanServerHandler.java:154)
	at org.jolokia.backend.LocalRequestDispatcher.dispatchRequest(LocalRequestDispatcher.java:99)
	at org.jolokia.backend.BackendManager.callRequestDispatcher(BackendManager.java:413)
	at org.jolokia.backend.BackendManager.handleRequest(BackendManager.java:158)
	at org.jolokia.http.HttpRequestHandler.executeRequest(HttpRequestHandler.java:197)
	at org.jolokia.http.HttpRequestHandler.handlePostRequest(HttpRequestHandler.java:131)
	at org.jolokia.http.AgentServlet$3.handleRequest(AgentServlet.java:420)
	at org.jolokia.http.AgentServlet$2.run(AgentServlet.java:297)
	at org.jolokia.http.AgentServlet$2.run(AgentServlet.java:295)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.jolokia.http.AgentServlet.handleSecurely(AgentServlet.java:295)
	at org.jolokia.http.AgentServlet.handle(AgentServlet.java:277)
	at org.jolokia.http.AgentServlet.doPost(AgentServlet.java:244)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:707)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:812)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1669)
	at org.eclipse.jetty.websocket.server.WebSocketUpgradeFilter.doFilter(WebSocketUpgradeFilter.java:201)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1652)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:585)
	at org.ops4j.pax.web.service.jetty.internal.HttpServiceServletHandler.doHandle(HttpServiceServletHandler.java:71)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:577)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:223)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1127)
	at org.ops4j.pax.web.service.jetty.internal.HttpServiceContext.doHandle(HttpServiceContext.java:287)
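The lookup above (map the high-CPU thread reported by top to its stack in the thread dump) can be scripted roughly as follows. This is a sketch for this case: the TID is the one top reported here, and the dump file name is the attachment name, so both would vary in other environments:

```shell
#!/bin/sh
# Sketch: locate the stack of a high-CPU thread in a Java thread dump.
# TID is the thread id that `top -H` reported as the CPU hog (from this case).
TID=24377
TDUMP=high-cpu-tdump_sample2.out

# Thread dumps record the native thread id ("nid") in hexadecimal,
# so convert the decimal TID from top into hex first.
NID=$(printf '%x' "$TID")
echo "looking for nid=0x$NID"

# Pull out the matching stack with some context (guarded in case the
# dump file is not present where this sketch is run).
if [ -f "$TDUMP" ]; then
    grep -A 20 "nid=0x$NID" "$TDUMP"
fi
```

With the TID from this case, the conversion yields `5f39`, which matches the `nid=0x5f39` on the hot `qtp9475471-79` thread in the dump.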
- The load average is also very high, as we can see in the top output:
top - 12:18:52 up 9 days, 21:20, 9 users, load average: 17.47, 11.97, 7.28
- There are two samples in the attachment: sample1 was taken while Hawtio was running; sample2 was taken when Hawtio wasn't running but there was still connectivity with JON.
- The customer reported that after stopping Hawtio and JON for this particular container, things went back to normal and the process runs fine. I have asked the customer for dumps (thread dump and heap dump) in this normal state as well, so that we can compare the two.
- From the heap dump, we can see that a few Jetty (qtp) threads dominate the retained heap:
Class Name                                                 | Shallow Heap | Retained Heap | Percentage
-------------------------------------------------------------------------------------------------------
java.util.HashMap @ 0x6c80b1048                            |           48 |  159,614,888 |     13.82%
java.lang.Thread @ 0x6cb14f8b0 qtp2050874208-151130 Thread |          120 |  138,620,776 |     12.00%
java.lang.Thread @ 0x6cb041508 qtp2050874208-151113 Thread |          120 |  138,340,184 |     11.98%
java.lang.Thread @ 0x6c79e8e20 qtp2050874208-77 Thread     |          120 |  131,524,152 |     11.39%
-------------------------------------------------------------------------------------------------------
- Drilling further down into those threads:
Class Name                                                 | Shallow Heap | Retained Heap | Percentage
-------------------------------------------------------------------------------------------------------
java.util.HashMap @ 0x6c80b1048                            |           48 |  159,614,888 |     13.82%
java.lang.Thread @ 0x6cb14f8b0 qtp2050874208-151130 Thread |          120 |  138,620,776 |     12.00%
|- org.json.simple.JSONArray @ 0x6d5254100                 |           24 |   90,055,152 |      7.80%
java.lang.Thread @ 0x6cb041508 qtp2050874208-151113 Thread |          120 |  138,340,184 |     11.98%
|- org.json.simple.JSONArray @ 0x6d526e670                 |           24 |   90,052,544 |      7.80%
java.lang.Thread @ 0x6c79e8e20 qtp2050874208-77 Thread     |          120 |  131,524,152 |     11.39%
|- org.json.simple.JSONArray @ 0x6d5252a58                 |           24 |   90,841,144 |      7.86%
-------------------------------------------------------------------------------------------------------
- Dumps are attached here.
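For context on how the threads end up in this state: the `ReadHandler.fetchAttributesForMBeanPattern` frame in the stack corresponds to a Jolokia bulk "read" request over an MBean pattern, and each matching MBean/attribute pair then goes through the `RBACRestrictor.canInvoke` / `KarafMBeanServerGuard` check also visible in the trace. A request of roughly this shape would exercise that path; the object-name pattern and attribute names below are illustrative (typical of what Hawtio's ActiveMQ tab reads), not taken from the customer's capture:

```json
{
  "type": "read",
  "mbean": "org.apache.activemq:type=Broker,brokerName=*,destinationType=Queue,destinationName=*",
  "attribute": ["QueueSize", "ConsumerCount", "EnqueueCount"]
}
```

With many queues, one such request fans out into many per-attribute RBAC checks, and the whole response is accumulated in a `JSONArray` on the handling qtp thread, which is consistent with both the CPU profile and the retained-heap figures above.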
Attachments
Issue Links
- relates to ENTESB-6737: hawtio overwhelms server when viewing 50-60 queues (activemq tab) (Closed)