Jetty VS Tomcat Overview

Jetty and Tomcat are open source servlet containers; both provide an HTTP server and a javax.servlet container (Jetty additionally ships an HTTP client). In this article, we take a quick look at the differences between Jetty and Tomcat and try to give a general idea of which one is the better fit.

You may think it makes little sense to compare the two containers: Tomcat is clearly the more widely discussed of the pair and offers developers a wealth of options. There is no doubt that many of us start with Tomcat during development because it is approachable and free. It is first and foremost a free application server that provides full web server functionality, and it can be stripped down for embedding or built up into a full J2EE server.

Jetty is an equally capable tool with its own particular strengths. It has been around since 1998 and describes itself as a “100% Java HTTP Server and Servlet Container”. It is first and foremost a set of software components that offer HTTP and servlet services. Jetty can be installed as a standalone application server or easily embedded in an application or framework as an HTTP component, so it can run as a simple servlet engine, as a feature-rich servlet engine, or as part of a full JEE environment.
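
To make the embedding point concrete, here is a minimal sketch (illustrative only, not taken from the test code in this article) of starting Jetty programmatically and registering a servlet. It is written against the Jetty 6 API (the org.mortbay packages, matching the 6.1.22 build benchmarked later); the HelloServlet class, the /hello path and port 8080 are placeholder choices, and newer Jetty releases moved and renamed these classes under the org.eclipse.jetty packages.

  import java.io.IOException;

  import javax.servlet.http.HttpServlet;
  import javax.servlet.http.HttpServletRequest;
  import javax.servlet.http.HttpServletResponse;

  import org.mortbay.jetty.Server;
  import org.mortbay.jetty.servlet.Context;
  import org.mortbay.jetty.servlet.ServletHolder;

  public class EmbeddedJettyExample {

   // Placeholder servlet, used only to show the embedding API
   public static class HelloServlet extends HttpServlet {
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
     resp.getWriter().println("hello from embedded Jetty");
    }
   }

   public static void main(String[] args) throws Exception {
    Server server = new Server(8080);                              // HTTP connector on port 8080
    Context context = new Context(server, "/", Context.SESSIONS);  // root context with sessions enabled
    context.addServlet(new ServletHolder(new HelloServlet()), "/hello");
    server.start();                                                // begin accepting requests
    server.join();                                                 // block until the server stops
   }
  }

Running main and requesting http://localhost:8080/hello should return the greeting; the same Server object can just as easily be wired into an existing application’s startup code.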

Let’s take a look at Jetty VS Tomcat:

Popularity:

The following figure gives a general idea of which Java containers / app servers are used the most.
[Figure: survey results – Java containers / app servers in use]
The results come from a survey of more than 1000 developers who reported which container they run in enterprise production; participants not currently using a container could instead pick the one they had used before or the one they expected to be best. The figure shows that Tomcat and Jetty are the two big open source winners, with Tomcat the clear leader over all others.

Features and advantages:

Jetty features and strengths:

  • Full-featured and standards-based.
  • Embeddable and Asynchronous.
  • Open source and commercially usable.
  • Dual licensed under Apache and Eclipse.
  • Flexible and extensible; enterprise scalable.
  • Strong support for tools, applications, devices and cloud computing.
  • Low maintenance cost.
  • Small and Efficient.

Tomcat features and strengths:

  • Well-known open source project under the Apache Software Foundation.
  • Easy to embed in your applications, as JBoss does (see the sketch after this list).
  • Implements the Servlet 3.0, JSP 2.2 and EL 2.2 specifications.
  • Robust and widely used commercially.
  • Easily integrated with other frameworks such as Spring.
  • Flexible and extensible; enterprise scalable.
  • Faster JSP parsing.
  • Stable.
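
As a counterpart to the embedding bullet above, embedding Tomcat is similarly compact. The sketch below (again illustrative, not from this article’s test setup) uses the org.apache.catalina.startup.Tomcat helper class available since Tomcat 7; the servlet, the /hello mapping, port 8888 and the working-directory doc base are placeholder choices.

  import java.io.File;
  import java.io.IOException;

  import javax.servlet.http.HttpServlet;
  import javax.servlet.http.HttpServletRequest;
  import javax.servlet.http.HttpServletResponse;

  import org.apache.catalina.Context;
  import org.apache.catalina.startup.Tomcat;

  public class EmbeddedTomcatExample {

   // Placeholder servlet, used only to show the embedding API
   public static class HelloServlet extends HttpServlet {
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
     resp.getWriter().println("hello from embedded Tomcat");
    }
   }

   public static void main(String[] args) throws Exception {
    Tomcat tomcat = new Tomcat();
    tomcat.setPort(8888);                                          // HTTP connector on port 8888
    Context ctx = tomcat.addContext("", new File(".").getAbsolutePath());
    Tomcat.addServlet(ctx, "hello", new HelloServlet());           // register the servlet instance
    ctx.addServletMapping("/hello", "hello");                      // map it to /hello
    tomcat.start();                                                // start the embedded server
    tomcat.getServer().await();                                    // block until shutdown
   }
  }

Later Tomcat versions rename the mapping method, so check the API of the release you target.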

Jetty VS Tomcat Performance benchmark

Test Environment:
CPU: Intel Core 2 Duo T6400 2.0GHz
RAM: 2 GB
JDK: Sun JVM 1.6
OS: Ubuntu

I wrote the servlet below to benchmark the two containers. It is fairly simple but gives us a general idea; the servlet URL is /servlet/TestRuning.

  import java.io.IOException;
  import java.io.PrintWriter;

  import javax.servlet.ServletException;
  import javax.servlet.http.HttpServlet;
  import javax.servlet.http.HttpServletRequest;
  import javax.servlet.http.HttpServletResponse;

  public class TestRuning extends HttpServlet {

   protected void doGet(HttpServletRequest request, HttpServletResponse response)
     throws ServletException, IOException {
    PrintWriter out = response.getWriter();
    String aStr = request.getParameter("a");
    String bStr = request.getParameter("b");

    // Default loop bounds, used when the parameters are missing or malformed
    int a = 100;
    int b = 100;
    try {
     a = Integer.parseInt(aStr);
     b = Integer.parseInt(bStr);
    } catch (Exception excep) {
     System.err.println("err:" + excep.getMessage());
    }

    // Busy loop: a * b integer divisions just to consume CPU (the result is unused)
    int sum = 0;
    long s = System.currentTimeMillis();
    for (int i = 0; i < a; ++i) {
     for (int ii = 0; ii < b; ++ii) {
      sum = a / b;
     }
    }
    long e = System.currentTimeMillis();
    long d = e - s;

    // The response body is just the elapsed time in milliseconds
    out.println(d);
    out.flush();
    out.close();
   }
  }

We then deploy this application into both Tomcat and Jetty, each with its default configuration and the same JRE version.

wapproxy@ubuntu:~$ ps -ef | grep java

wapproxy  2076     1  1 11:28 ?        00:00:03 /usr/lib/jvm/java-6-openjdk/jre/bin/java -Djetty.home=/home/wapproxy/jetty -Djava.io.tmpdir=/tmp -jar /home/wapproxy/jetty/start.jar /home/wapproxy/jetty/etc/jetty-logging.xml /home/wapproxy/jetty/etc/jetty.xml
wapproxy  2185  1398  8 11:30 pts/0    00:00:02 /usr/lib/jvm/java-6-openjdk/jre/bin/java -Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager -Djava.util.logging.config.file=/home/wapproxy/Tomcat/conf/logging.properties -Djava.endorsed.dirs=/home/wapproxy/Tomcat/endorsed -classpath :/home/wapproxy/Tomcat/bin/bootstrap.jar -Dcatalina.base=/home/wapproxy/Tomcat -Dcatalina.home=/home/wapproxy/Tomcat -Djava.io.tmpdir=/home/wapproxy/Tomcat/temp org.apache.catalina.startup.Bootstrap start
wapproxy  2329  2309  0 11:31 pts/1    00:00:00 grep --color=auto java

Tomcat listens on port 8888 and Jetty on port 8080. Then we run the load test:
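
The reports below are in the format produced by ApacheBench (ab). Assuming ab was indeed the load generator, a command along these lines reproduces the Jetty run (5000 requests at a concurrency of 1); the Tomcat run is the same with port 8888:

  ab -n 5000 -c 1 "http://172.31.36.158:8080/jt_jt/servlet/TestRuning?a=100000&b=100000"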

This is the Jetty performance report:

Server Software:        Jetty(6.1.22)
Server Hostname:        172.31.36.158
Server Port:            8080

Document Path:          /jt_jt/servlet/TestRuning?a=100000&b=100000
Document Length:        2 bytes

Concurrency Level:      1
Time taken for tests:   8.715 seconds
Complete requests:      5000
Failed requests:        0
Write errors:           0
Total transferred:      445000 bytes
HTML transferred:       10000 bytes
Requests per second:    573.72 [#/sec] (mean)
Time per request:       1.743 [ms] (mean)
Time per request:       1.743 [ms] (mean, across all concurrent requests)
Transfer rate:          49.86 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   1.1      0       5
Processing:     0    1   7.1      0      50
Waiting:        0    1   7.1      0      50
Total:          0    2   7.2      0      50

Percentage of the requests served within a certain time (ms)
  50%      0
  66%      0
  75%      0
  80%      0
  90%      5
  95%      5
  98%     45
  99%     50
 100%     50 (longest request)

This is the Tomcat performance report:

Server Software:        Apache-Coyote/1.1
Server Hostname:        172.31.36.158
Server Port:            8888

Document Path:          /jt_jt/servlet/TestRuning?a=100000&b=100000
Document Length:        3 bytes

Concurrency Level:      1
Time taken for tests:   4.070 seconds
Complete requests:      5000
Failed requests:        0
Write errors:           0
Total transferred:      650000 bytes
HTML transferred:       15000 bytes
Requests per second:    1228.50 [#/sec] (mean)
Time per request:       0.814 [ms] (mean)
Time per request:       0.814 [ms] (mean, across all concurrent requests)
Transfer rate:          155.96 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   1.2      0       5
Processing:     0    0   1.7      0      45
Waiting:        0    0   1.7      0      45
Total:          0    1   2.1      0      45

Percentage of the requests served within a certain time (ms)
  50%      0
  66%      0
  75%      0
  80%      0
  90%      5
  95%      5
  98%      5
  99%      5
 100%     45 (longest request)

The following is the key data from our testing.

jetty 8080 Requests per second:    573.72 [#/sec] (mean)
tomcat 8888  Requests per second:    1228.50 [#/sec] (mean)

Tomcat handles about 1228 requests per second while Jetty manages only about 573, less than half, so at least in this test Tomcat comes out ahead.

More testing on Tomcat

Concurrent Requests   Request Wait Time (ms)   Request Handling Time (ms)   Throughput (req/s)
1                     0.422                    0.422                        2370.37
5                     1.641                    0.328                        3047.62
10                    3.125                    0.313                        3200
20                    6.563                    0.328                        3047.62
40                    12.5                     0.313                        3200
60                    20.625                   0.344                        2909.09
80                    25                       0.313                        3200
100                   34.375                   0.344                        2909.09
200                   596.875                  2.984                        335.08
300                   618.75                   2.063                        484.85
400                   1006.25                  2.516                        397.52

[Chart: Jetty VS Tomcat]

More testing on Jetty

Concurrent Requests   Request Wait Time (ms)   Request Handling Time (ms)   Throughput (req/s)
1                     6.391                    6.391                        156.48
5                     11.484                   2.297                        435.37
10                    19.063                   1.906                        524.59
20                    25.625                   1.281                        780.49
40                    0.797                    31.875                       1254.9
60                    6.578                    394.688                      152.02
80                    5.563                    445                          179.78
100                   1.781                    178.125                      561.4
200                   6.984                    1396.875                     143.18
300                   3.109                    932.813                      321.61
400                   6.531                    2612.813                     153.11

[Chart: Jetty VS Tomcat]

This is a simple test run over a short period, so it only captures one aspect of performance. If you have any questions, please let me know.