JBoss A-MQ / ENTMQ-1468

Unresponsive Console and High Hawtio CPU Usage On Client and Server followed by OOM


Details

    • Type: Bug
    • Resolution: Duplicate
    • Priority: Critical
    • Fix Version/s: None
    • Affects Version/s: JBoss A-MQ 6.2.1
    • Component/s: hawtio (console)
    • Labels: None
    • User Experience

    Steps to Reproduce

      Configure A-MQ by adding 2500 destinations (queues) to broker.xml (a hypothetical generator sketch follows this list).
      Start A-MQ.
      Wait a few moments and browse to the hawtio console. Watch CPU and memory usage in top or a similar utility.
      Go to Preferences > Jolokia, set MaxCollectionSize to 2500, reduce Max Depth of JSON objects to 3, and apply the change.
      Browser CPU usage climbs to 100%+.
      "Unresponsive Page" / "Unresponsive Script" warnings are thrown in the browser.
      Heap dumps of the server show a climb in memory usage, particularly java.util.HashMap$Entry, java.util.HashMap$Entry[], and char[].

      If the client windows are held open and the unresponsive page/script warnings are dismissed with "wait for page/script", memory usage continues to climb, CPU usage spikes, and the container crashes with an OutOfMemoryError (heap).
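
      The following is a minimal sketch of one way to produce the 2500 queue entries for the reproducer. The <queue physicalName="..."/> elements follow the standard ActiveMQ <destinations> configuration, but the class name, the TEST.QUEUE.n naming scheme, and the idea of generating the XML with a helper are assumptions added for illustration, not part of the original report.

      // Hypothetical helper: prints a <destinations> block with N queues that can
      // be pasted inside the <broker> element of the A-MQ configuration file.
      public class GenerateDestinations {
          public static void main(String[] args) {
              int count = args.length > 0 ? Integer.parseInt(args[0]) : 2500;
              StringBuilder xml = new StringBuilder("<destinations>\n");
              for (int i = 0; i < count; i++) {
                  xml.append("  <queue physicalName=\"TEST.QUEUE.")
                     .append(i)
                     .append("\"/>\n");
              }
              xml.append("</destinations>\n");
              System.out.print(xml);
          }
      }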


    Description

      When A-MQ is configured with a large number of destinations and a client browser is directed to the hawtio console, CPU usage spikes on both the client and the server: both Firefox and Chrome showed 100%+ CPU usage, while server CPU usage jumped to 300-400%.

      Eventually the following server stack trace was delivered to the client browser and logged in the server log:

      java.lang.OutOfMemoryError: Java heap space
      at java.util.Arrays.copyOfRange(Arrays.java:2694)[:1.7.0_91]
      at java.lang.String.<init>(String.java:203)[:1.7.0_91]
      at java.lang.StringBuffer.toString(StringBuffer.java:561)[:1.7.0_91]
      at org.json.simple.JSONObject.toJSONString(JSONObject.java:121)[163:io.hawt.hawtio-web:1.4.0.redhat-621084]
      at org.json.simple.JSONObject.toJSONString(JSONObject.java:101)[163:io.hawt.hawtio-web:1.4.0.redhat-621084]
      at org.json.simple.JSONObject.toJSONString(JSONObject.java:108)[163:io.hawt.hawtio-web:1.4.0.redhat-621084]
      at org.json.simple.JSONValue.toJSONString(JSONValue.java:199)[163:io.hawt.hawtio-web:1.4.0.redhat-621084]
      at org.json.simple.JSONObject.toJSONString(JSONObject.java:119)[163:io.hawt.hawtio-web:1.4.0.redhat-621084]
      at org.json.simple.JSONObject.toJSONString(JSONObject.java:101)[163:io.hawt.hawtio-web:1.4.0.redhat-621084]
      at org.json.simple.JSONObject.toJSONString(JSONObject.java:108)[163:io.hawt.hawtio-web:1.4.0.redhat-621084]
      at org.json.simple.JSONValue.toJSONString(JSONValue.java:199)[163:io.hawt.hawtio-web:1.4.0.redhat-621084]
      at org.json.simple.JSONObject.toJSONString(JSONObject.java:119)[163:io.hawt.hawtio-web:1.4.0.redhat-621084]
      at org.json.simple.JSONObject.toJSONString(JSONObject.java:101)[163:io.hawt.hawtio-web:1.4.0.redhat-621084]
      at org.json.simple.JSONObject.toJSONString(JSONObject.java:108)[163:io.hawt.hawtio-web:1.4.0.redhat-621084]
      at org.jolokia.http.AgentServlet.handle(AgentServlet.java:292)[163:io.hawt.hawtio-web:1.4.0.redhat-621084]
      at org.jolokia.http.AgentServlet.doPost(AgentServlet.java:252)[163:io.hawt.hawtio-web:1.4.0.redhat-621084]
      at javax.servlet.http.HttpServlet.service(HttpServlet.java:595)[48:org.apache.geronimo.specs.geronimo-servlet_3.0_spec:1.0.0]
      at javax.servlet.http.HttpServlet.service(HttpServlet.java:668)[48:org.apache.geronimo.specs.geronimo-servlet_3.0_spec:1.0.0]
      at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:684)[95:org.eclipse.jetty.aggregate.jetty-all-server:8.1.17.v20150415]
      at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1496)[95:org.eclipse.jetty.aggregate.jetty-all-server:8.1.17.v20150415]
      at io.hawt.web.AuthenticationFilter$2.run(AuthenticationFilter.java:166)[163:io.hawt.hawtio-web:1.4.0.redhat-621084]
      at java.security.AccessController.doPrivileged(Native Method)[:1.7.0_91]
      at javax.security.auth.Subject.doAs(Subject.java:415)[:1.7.0_91]
      at io.hawt.web.AuthenticationFilter.executeAs(AuthenticationFilter.java:163)[163:io.hawt.hawtio-web:1.4.0.redhat-621084]
      at io.hawt.web.AuthenticationFilter.doFilter(AuthenticationFilter.java:130)[163:io.hawt.hawtio-web:1.4.0.redhat-621084]
      at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1467)[95:org.eclipse.jetty.aggregate.jetty-all-server:8.1.17.v20150415]
      at io.hawt.web.XFrameOptionsFilter.doFilter(XFrameOptionsFilter.java:28)[163:io.hawt.hawtio-web:1.4.0.redhat-621084]
      at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1467)[95:org.eclipse.jetty.aggregate.jetty-all-server:8.1.17.v20150415]
      at io.hawt.web.CORSFilter.doFilter(CORSFilter.java:42)[163:io.hawt.hawtio-web:1.4.0.redhat-621084]
      at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1467)[95:org.eclipse.jetty.aggregate.jetty-all-server:8.1.17.v20150415]
      at io.hawt.web.CacheHeadersFilter.doFilter(CacheHeadersFilter.java:37)[163:io.hawt.hawtio-web:1.4.0.redhat-621084]
      at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1467)[95:org.eclipse.jetty.aggregate.jetty-all-server:8.1.17.v20150415]
      2015-12-30 17:07:57,704 | WARN | tp211467503-3187 | nio | 95 - org.eclipse.jetty.aggregate.jetty-all-server - 8.1.17.v20150415 | handle failed
      java.lang.OutOfMemoryError: GC overhead limit exceeded
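
      The trace shows the Jolokia agent (org.jolokia.http.AgentServlet) recursively serializing a very large response with org.json.simple before the heap is exhausted. As a point of comparison, the same kind of request can be issued directly with Jolokia's documented processing parameters (maxDepth, maxCollectionSize) to bound the payload. The sketch below is illustrative only: the endpoint URL, the admin:admin credentials, and the use of a plain HttpURLConnection client are assumptions about a default local install, not part of this report.

      import java.io.InputStream;
      import java.io.OutputStream;
      import java.net.HttpURLConnection;
      import java.net.URL;
      import java.nio.charset.StandardCharsets;
      import java.util.Base64;

      // Sends a Jolokia "list" request with processing limits so the JSON
      // response stays bounded; compare the response size with and without
      // the "config" block to gauge the effect of the limits.
      public class BoundedJolokiaList {
          public static void main(String[] args) throws Exception {
              String body = "{\"type\":\"list\","
                      + "\"config\":{\"maxDepth\":3,\"maxCollectionSize\":2500}}";

              URL url = new URL("http://localhost:8181/hawtio/jolokia"); // assumed path
              HttpURLConnection conn = (HttpURLConnection) url.openConnection();
              conn.setRequestMethod("POST");
              conn.setDoOutput(true);
              conn.setRequestProperty("Content-Type", "application/json");
              String auth = Base64.getEncoder()
                      .encodeToString("admin:admin".getBytes(StandardCharsets.UTF_8)); // assumed credentials
              conn.setRequestProperty("Authorization", "Basic " + auth);

              try (OutputStream out = conn.getOutputStream()) {
                  out.write(body.getBytes(StandardCharsets.UTF_8));
              }

              System.out.println("HTTP " + conn.getResponseCode());
              long total = 0;
              try (InputStream in = conn.getInputStream()) {
                  byte[] buf = new byte[8192];
                  for (int n; (n = in.read(buf)) > 0; ) {
                      total += n;
                  }
              }
              System.out.println("Response bytes: " + total);
          }
      }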


          People

            Assignee: Dejan Bosanac (dejanbosanac)
            Reporter: Duane Hawkins (rhn-support-dhawkins)
            Votes: 0
            Watchers: 2
