
- Upgrade Spark, Hibernate and Gson versions (#4034)

- Replaced c3p0 with HikariCP, with a dynamic pool size based on the number of cores.
- Replaced the custom template rendering with spark-template-mustache.
- No longer build a WAR deployed on Resin; use the micro-service approach with Spark's embedded Jetty instead.
- Updated documentation.
- Performance skyrockets with these modifications. Spark will no longer be last! On my laptop, the db benchmark goes from 45k requests to 114k; update, from 20k to 35k; json, from 140k to 360k; query, from 45k to 110k.
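The dynamic pool sizing mentioned above can be sketched as follows. This is a minimal illustration, not the repository's code: the 2 × cores rule and the `hibernate.hikari.maximumPoolSize` property come from the change itself, while the class and method names here are hypothetical.

```java
// Sketch of the dynamic HikariCP pool sizing described in the commit message:
// the maximum pool size is derived from the host's core count (2 * cores).
// Class and method names are illustrative, not taken from the repository.
public class PoolSizing {

    /** Maximum HikariCP pool size: twice the number of available cores. */
    static int maxPoolSize() {
        return Runtime.getRuntime().availableProcessors() * 2;
    }

    public static void main(String[] args) {
        // This is the value the change passes to Hibernate as the
        // "hibernate.hikari.maximumPoolSize" property in HibernateUtil.
        System.out.println("hibernate.hikari.maximumPoolSize = " + maxPoolSize());
    }
}
```

Sizing the pool relative to the core count (rather than the fixed c3p0 min/max of 32/256 it replaces) follows the guidance in HikariCP's pool-sizing wiki page linked from the README.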
jringuette 7 years ago
parent
commit
41441237c0

+ 21 - 27
frameworks/Java/spark/README.md

@@ -1,15 +1,14 @@
 # Spark Benchmarking Test
 
 This is the Spark portion of a [benchmarking test suite](../) comparing a variety of web development platforms.
-The test utilizes Spark routes, Gson for JSON serialization, Hibernate for ORM and a custom OSIV pattern created
-with Spark filters.
+The test utilizes Spark routes, Gson for JSON serialization, Hibernate for ORM and mustache templates rendering.
+HikariCP is used for connection pooling, using up to 2*cores count. See [pool sizing](https://github.com/brettwooldridge/HikariCP/wiki/About-Pool-Sizing)
 
 
 ## Tests
 
 * [Spark application](/src/main/java/hello/web/SparkApplication.java)
-* [Hibernate](http://www.hibernate.org/) configuration for local datasource and container managed JNDI
- * [JNDI configuration](/src/main/resources/hibernate-jndi.cfg.xml)
+* [Hibernate](http://www.hibernate.org/) configuration for local datasource
  * [Local datasource configuration](/src/main/resources/hibernate-local.cfg.xml)
  * [Hibernate utilities](/src/main/java/hello/web/HibernateUtil.java)
  * [Database entity](/src/main/java/hello/domain/World.java)
@@ -17,43 +16,38 @@ with Spark filters.
 
 ## Infrastructure Software Versions
 
-* [Spark 1.1](http://www.sparkjava.com/)
-* [Hibernate 4.2.6.Final](http://www.hibernate.org/)
-* [Gson 2.2.4](https://code.google.com/p/google-gson/)
+* [Spark 2.7.1](http://www.sparkjava.com/)
+* [Hibernate 5.3.6.Final](http://www.hibernate.org/)
+* [Gson 2.8.5](https://github.com/google/gson)
 
 
 ## Different test setups
 
-* Local environment with Spark's built in embedded jetty (port=4567, context=/)
+* Local environment with Spark's built in embedded jetty (port=8080, context=/)
  * Start application from [SparkApplication](/src/main/java/hello/web/SparkApplication.java)'s main method
- * 'standalone' maven profile must be enabled from [pom.xml](/pom.xml)
-* Local environment with Tomcat maven plugin (port=8080, context=/spark)
- * Start application with maven command 'mvn clean tomcat7:run'
- * No maven profiles must be enabled
-* Any servlet container with built WAR (port=any, context=/spark)
- * Create war with maven command 'mvn clean package'
- * No maven profiles must be enabled
- * Built war can be copied from /target/spark.war
-* Local datasource or JNDI datasource can be configured with system property 'jndi'
- * -Djndi=true or no property for JNDI datasource
- * -Djndi=false for local datasource
 
 ## Test URLs
 
 ### JSON Encoding Test
 
-http://localhost:4567/json
+http://localhost:8080/json
 
-http://localhost:8080/spark/json
+### Database Test
 
-### Data-Store/Database Mapping Test
+http://localhost:8080/db
 
-http://localhost:4567/db?queries=5
+### Query Test
 
-http://localhost:8080/spark/db?queries=5
+http://localhost:8080/db?queries=5
 
-### Plain Text Test
+### Update Test
+
+http://localhost:8080/updates?queries=5
+
+### Fortune cookie Test
 
-http://localhost:4567/plaintext
+http://localhost:8080/fortunes
+
+### Plain Text Test
 
-http://localhost:8080/spark/plaintext
+http://localhost:8080/plaintext

+ 1 - 1
frameworks/Java/spark/benchmark_config.json

@@ -17,7 +17,7 @@
       "flavor": "None",
       "orm": "Full",
       "platform": "Servlet",
-      "webserver": "Resin",
+      "webserver": "Jetty",
       "os": "Linux",
       "database_os": "Linux",
       "display_name": "spark",

+ 55 - 59
frameworks/Java/spark/pom.xml

@@ -5,16 +5,17 @@
     <groupId>hello.world</groupId>
     <artifactId>hello-spark</artifactId>
     <name>Spark Test project</name>
-    <packaging>war</packaging>
+    <packaging>jar</packaging>
     <version>1.0.0-BUILD-SNAPSHOT</version>
 
     <properties>
         <java-version>1.8</java-version>
-        <spark-version>2.3</spark-version>
-        <hibernate-version>4.3.0.Final</hibernate-version>
-        <gson-version>2.2.4</gson-version>
-        <mysql-connector-version>5.1.38</mysql-connector-version>
-        <slf4j-version>1.7.5</slf4j-version>
+        <spark-version>2.7.1</spark-version>
+        <hibernate-version>5.3.6.Final</hibernate-version>
+        <gson-version>2.8.5</gson-version>
+        <mysql-connector-version>5.1.47</mysql-connector-version>
+        <slf4j-version>1.7.25</slf4j-version>
+        <exec.mainClass>hello.web.SparkApplication</exec.mainClass>
     </properties>
 
     <prerequisites>
@@ -27,40 +28,29 @@
             <artifactId>spark-core</artifactId>
         </dependency>
         <dependency>
-            <groupId>javax.servlet</groupId>
-            <artifactId>javax.servlet-api</artifactId>
-            <version>3.1.0</version>
-            <scope>provided</scope>
+            <groupId>com.sparkjava</groupId>
+            <artifactId>spark-template-mustache</artifactId>
         </dependency>
         <dependency>
             <groupId>com.google.code.gson</groupId>
             <artifactId>gson</artifactId>
-            <version>${gson-version}</version>
         </dependency>
         <dependency>
             <groupId>org.hibernate</groupId>
             <artifactId>hibernate-core</artifactId>
-            <version>${hibernate-version}</version>
         </dependency>
         <dependency>
             <groupId>org.hibernate</groupId>
-            <artifactId>hibernate-c3p0</artifactId>
-            <version>${hibernate-version}</version>
+            <artifactId>hibernate-hikaricp</artifactId>
         </dependency>
         <dependency>
             <groupId>mysql</groupId>
             <artifactId>mysql-connector-java</artifactId>
-            <version>${mysql-connector-version}</version>
         </dependency>
         <dependency>
             <groupId>org.slf4j</groupId>
             <artifactId>slf4j-log4j12</artifactId>
         </dependency>
-        <dependency>
-            <groupId>com.j2html</groupId>
-            <artifactId>j2html</artifactId>
-            <version>0.7</version>
-        </dependency>
     </dependencies>
 
     <dependencyManagement>
@@ -69,16 +59,31 @@
                 <groupId>com.sparkjava</groupId>
                 <artifactId>spark-core</artifactId>
                 <version>${spark-version}</version>
-                <exclusions>
-                    <exclusion>
-                        <groupId>org.eclipse.jetty</groupId>
-                        <artifactId>jetty-server</artifactId>
-                    </exclusion>
-                    <exclusion>
-                        <groupId>org.eclipse.jetty</groupId>
-                        <artifactId>jetty-webapp</artifactId>
-                    </exclusion>
-                </exclusions>
+            </dependency>
+            <dependency>
+                <groupId>com.sparkjava</groupId>
+                <artifactId>spark-template-mustache</artifactId>
+                <version>${spark-version}</version>
+            </dependency>
+            <dependency>
+                <groupId>com.google.code.gson</groupId>
+                <artifactId>gson</artifactId>
+                <version>${gson-version}</version>
+            </dependency>
+            <dependency>
+                <groupId>org.hibernate</groupId>
+                <artifactId>hibernate-core</artifactId>
+                <version>${hibernate-version}</version>
+            </dependency>
+            <dependency>
+                <groupId>org.hibernate</groupId>
+                <artifactId>hibernate-hikaricp</artifactId>
+                <version>${hibernate-version}</version>
+            </dependency>
+            <dependency>
+                <groupId>mysql</groupId>
+                <artifactId>mysql-connector-java</artifactId>
+                <version>${mysql-connector-version}</version>
             </dependency>
             <dependency>
                 <groupId>org.slf4j</groupId>
@@ -93,21 +98,6 @@
         </dependencies>
     </dependencyManagement>
 
-    <profiles>
-        <profile>
-            <id>standalone</id>
-            <dependencyManagement>
-                <dependencies>
-                    <dependency>
-                        <groupId>com.sparkjava</groupId>
-                        <artifactId>spark-core</artifactId>
-                        <version>${spark-version}</version>
-                    </dependency>
-                </dependencies>
-            </dependencyManagement>
-        </profile>
-    </profiles>
-
     <build>
         <resources>
             <resource>
@@ -126,20 +116,26 @@
             </plugin>
             <plugin>
                 <groupId>org.apache.maven.plugins</groupId>
-                <artifactId>maven-war-plugin</artifactId>
-                <configuration>
-                    <warName>spark</warName>
-                </configuration>
-            </plugin>
-            <plugin>
-                <groupId>org.apache.tomcat.maven</groupId>
-                <artifactId>tomcat7-maven-plugin</artifactId>
-                <version>2.1</version>
-                    <configuration>
-                        <systemProperties>
-                            <jndi>false</jndi>
-                        </systemProperties>
-                    </configuration>
+                <artifactId>maven-shade-plugin</artifactId>
+                <version>3.1.0</version>
+                <executions>
+                    <execution>
+                        <phase>package</phase>
+                        <goals>
+                            <goal>shade</goal>
+                        </goals>
+                        <configuration>
+                            <transformers>
+                                <transformer
+                                        implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
+                                    <mainClass>${exec.mainClass}</mainClass>
+                                </transformer>
+                                <transformer
+                                        implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
+                            </transformers>
+                        </configuration>
+                    </execution>
+                </executions>
             </plugin>
         </plugins>
     </build>

+ 0 - 18
frameworks/Java/spark/resin.xml

@@ -1,18 +0,0 @@
-<resin xmlns="http://caucho.com/ns/resin"
-       xmlns:resin="http://caucho.com/ns/resin/core">
-
-    <cluster id="">
-        <resin:import path="/resin/conf/app-default.xml" />
-        
-        <log name="" level="config" path="stdout:" timestamp="[%H:%M:%S.%s] " />
-
-        <server id="">
-            <http port="8080" />
-        </server>
-
-        <host>
-            <web-app-deploy path="/resin/webapps"
-                            expand-preserve-fileset="WEB-INF/work/**"/>
-        </host>
-    </cluster>
-</resin>

+ 3 - 6
frameworks/Java/spark/spark.dockerfile

@@ -5,9 +5,6 @@ COPY pom.xml pom.xml
 RUN mvn package -q
 
 FROM openjdk:8-jdk
-WORKDIR /resin
-RUN curl -sL http://caucho.com/download/resin-4.0.56.tar.gz | tar xz --strip-components=1
-RUN rm -rf webapps/*
-COPY --from=maven /spark/target/spark.war webapps/ROOT.war
-COPY resin.xml conf/resin.xml
-CMD ["java", "-jar", "lib/resin.jar", "console"]
+WORKDIR /spark
+COPY --from=maven /spark/target/hello-spark-1.0.0-BUILD-SNAPSHOT.jar app.jar
+CMD ["java", "-server", "-XX:+UseNUMA", "-XX:+UseParallelGC", "-jar", "app.jar"]

+ 1 - 10
frameworks/Java/spark/src/main/java/hello/web/HibernateUtil.java

@@ -43,6 +43,7 @@ public class HibernateUtil {
             configuration.setProperty(AvailableSettings.USE_QUERY_CACHE, "false");
             configuration.setProperty(AvailableSettings.SHOW_SQL, "false");
             configuration.setProperty(AvailableSettings.CURRENT_SESSION_CONTEXT_CLASS, "thread");
+            configuration.setProperty("hibernate.hikari.maximumPoolSize", String.valueOf(Runtime.getRuntime().availableProcessors() * 2));
             configuration.addAnnotatedClass(World.class);
             configuration.addAnnotatedClass(Fortune.class);
             StandardServiceRegistryBuilder serviceRegistryBuilder = new StandardServiceRegistryBuilder().applySettings(configuration.getProperties());
@@ -54,17 +55,7 @@ public class HibernateUtil {
     }
 
     private static Configuration configuration() {
-        boolean jndi = Boolean.parseBoolean(System.getProperty("jndi", "true"));
         Configuration configuration = new Configuration();
-        // We're always going to use the -local config now since there were previous
-        // problems with the jndi config.
-        /*
-        if (jndi) {
-            configuration.configure("/hibernate-jndi.cfg.xml");
-        } else {
-            configuration.configure("/hibernate-local.cfg.xml");
-        }
-        */
         configuration.configure("/hibernate-local.cfg.xml");
         return configuration;
     }

+ 42 - 73
frameworks/Java/spark/src/main/java/hello/web/SparkApplication.java

@@ -1,65 +1,60 @@
 package hello.web;
 
-import static spark.Spark.after;
-import static spark.Spark.get;
+import com.google.gson.Gson;
+import hello.domain.Fortune;
 import hello.domain.Message;
 import hello.domain.World;
-import hello.domain.Fortune;
-
-import java.util.Date;
-import java.util.Random;
-import java.util.List;
-import java.util.Collections;
-import java.util.concurrent.ThreadLocalRandom;
-import java.util.stream.Collectors;
-
 import org.hibernate.Session;
 import org.hibernate.Transaction;
-
+import spark.ModelAndView;
 import spark.Request;
-import spark.Response;
+import spark.template.mustache.MustacheTemplateEngine;
 
-import com.google.gson.Gson;
+import java.util.Collections;
+import java.util.List;
+import java.util.Random;
+import java.util.concurrent.ThreadLocalRandom;
 
-import static j2html.TagCreator.*;
+import static spark.Spark.after;
+import static spark.Spark.get;
+import static spark.Spark.port;
 
-public class SparkApplication implements spark.servlet.SparkApplication {
+public class SparkApplication {
 
     private static final int DB_ROWS = 10000;
-    private static final int FORTUNE_ROWS = 12;
     private static final String MESSAGE = "Hello, World!";
     private static final String ADDITIONAL_FORTUNE = "Additional fortune added at request time.";
     private static final String CONTENT_TYPE_JSON = "application/json";
     private static final String CONTENT_TYPE_TEXT = "text/plain";
     private static final Gson GSON = new Gson();
 
-    private int getQueries(final Request request) {
-      try {
-        String param = request.queryParams("queries");
-        if (param == null) {
-          return 1;
-        }
+    private static int getQueries(final Request request) {
+        try {
+            String param = request.queryParams("queries");
+            if (param == null) {
+                return 1;
+            }
 
-        int queries = Integer.parseInt(param);
-        if (queries < 1) {
-          return 1;
-        }
-        if (queries > 500) {
-          return 500;
+            int queries = Integer.parseInt(param);
+            if (queries < 1) {
+                return 1;
+            }
+            if (queries > 500) {
+                return 500;
+            }
+            return queries;
+        } catch (NumberFormatException ex) {
+            return 1;
         }
-        return queries;
-      } catch (NumberFormatException ex) {
-        return 1;
-      }
     }
 
-    @Override
-    public void init() {
-
+    public static void main(String[] args) {
+        port(8080);
         get("/json", (request, response) -> {
-          response.type(CONTENT_TYPE_JSON);
-          return new Message(); }
-        , GSON::toJson);
+                    response.type(CONTENT_TYPE_JSON);
+                    return new Message();
+                }
+                , GSON::toJson);
         get("/db", (request, response) -> {
             response.type(CONTENT_TYPE_JSON);
 
@@ -102,44 +97,18 @@ public class SparkApplication implements spark.servlet.SparkApplication {
             return MESSAGE;
         });
         get("/fortunes", (request, response) -> {
-          final Session session = HibernateUtil.getSession();
-          Fortune newFortune = new Fortune();
-          newFortune.id = 0;
-          newFortune.message = ADDITIONAL_FORTUNE;
-          List<Fortune> fortunes = session.createCriteria(Fortune.class).list();
-          fortunes.add(newFortune);
-          Collections.sort(fortunes, (f1, f2) -> f1.message.compareTo(f2.message));
-          return document().render() +
-            html().with(
-                head().with(
-                    title("Fortunes")
-                ),
-                body().with(
-                    table().with(
-                        tr().with(
-                            th("id"),
-                            th("message")
-                        )).with(
-                        fortunes.stream().map((fortune) ->
-                            tr().with(
-                                td(Integer.toString(fortune.id)),
-                                td(fortune.message)
-                            )
-                        ).collect(Collectors.toList())
-                    )
-                )
-          ).render();
-
+            final Session session = HibernateUtil.getSession();
+            Fortune newFortune = new Fortune();
+            newFortune.id = 0;
+            newFortune.message = ADDITIONAL_FORTUNE;
+            List<Fortune> fortunes = session.createCriteria(Fortune.class).list();
+            fortunes.add(newFortune);
+            Collections.sort(fortunes, (f1, f2) -> f1.message.compareTo(f2.message));
+            return new MustacheTemplateEngine().render(new ModelAndView(Collections.singletonMap("fortunes", fortunes), "fortunes.mustache"));
         });
         after((request, response) -> {
             HibernateUtil.closeSession();
-            response.raw().addDateHeader("Date", new Date().getTime());
         });
     }
 
-    public static void main(final String[] args) {
-        System.setProperty("jndi", "false");
-        new SparkApplication().init();
-    }
-
 }

+ 0 - 8
frameworks/Java/spark/src/main/resources/hibernate-jndi.cfg.xml

@@ -1,8 +0,0 @@
-<!DOCTYPE hibernate-configuration PUBLIC
-        "-//Hibernate/Hibernate Configuration DTD 3.0//EN"
-        "http://www.hibernate.org/dtd/hibernate-configuration-3.0.dtd">
-<hibernate-configuration>
-    <session-factory>
-        <property name="hibernate.connection.datasource">java:comp/env/jdbc/hello_world</property>
-    </session-factory>
-</hibernate-configuration>

+ 0 - 4
frameworks/Java/spark/src/main/resources/hibernate-local.cfg.xml

@@ -7,9 +7,5 @@
         <property name="hibernate.connection.url">jdbc:mysql://tfb-database:3306/hello_world?jdbcCompliantTruncation=false&amp;elideSetAutoCommits=true&amp;useLocalSessionState=true&amp;cachePrepStmts=true&amp;cacheCallableStmts=true&amp;alwaysSendSetIsolation=false&amp;prepStmtCacheSize=4096&amp;cacheServerConfiguration=true&amp;prepStmtCacheSqlLimit=2048&amp;zeroDateTimeBehavior=convertToNull&amp;traceProtocol=false&amp;useUnbufferedInput=false&amp;useReadAheadInput=false&amp;maintainTimeStats=false&amp;useServerPrepStmts&amp;cacheRSMetadata=true&amp;useSSL=false</property>
         <property name="hibernate.connection.username">benchmarkdbuser</property>
         <property name="hibernate.connection.password">benchmarkdbpass</property>
-        <property name="hibernate.c3p0.min_size">32</property>
-        <property name="hibernate.c3p0.max_size">256</property>
-        <property name="hibernate.c3p0.timeout">1800</property>
-        <property name="hibernate.c3p0.max_statements">50</property>
     </session-factory>
 </hibernate-configuration>

+ 2 - 1
frameworks/Java/spark/src/main/resources/log4j.properties

@@ -1,4 +1,4 @@
-log4j.rootLogger=WARN, console
+log4j.rootLogger=INFO, console
 
 log4j.appender.console=org.apache.log4j.ConsoleAppender
 log4j.appender.console.layout=org.apache.log4j.PatternLayout
@@ -6,3 +6,4 @@ log4j.appender.console.layout.ConversionPattern=%d %-5p %c %x - %m%n
 log4j.appender.console.Target=System.err
 
 log4j.logger.hello=DEBUG
+log4j.logger.org.hibernate=ERROR

+ 20 - 0
frameworks/Java/spark/src/main/resources/templates/fortunes.mustache

@@ -0,0 +1,20 @@
+<!DOCTYPE html>
+<html>
+<head>
+  <title>Fortunes</title>
+</head>
+<body>
+<table>
+  <tr>
+    <th>id</th>
+    <th>message</th>
+  </tr>
+  {{#fortunes}}
+  <tr>
+    <td>{{id}}</td>
+    <td>{{message}}</td>
+  </tr>
+  {{/fortunes}}
+</table>
+</body>
+</html>

+ 0 - 13
frameworks/Java/spark/src/main/webapp/WEB-INF/resin-web.xml

@@ -1,13 +0,0 @@
-<web-app xmlns="http://caucho.com/ns/resin">
-
-<database jndi-name='jdbc/hello_world'>
-  <driver>
-    <type>com.mysql.jdbc.jdbc2.optional.MysqlConnectionPoolDataSource</type>
-    <url>jdbc:mysql://tfb-database:3306/hello_world?jdbcCompliantTruncation=false&amp;elideSetAutoCommits=true&amp;useLocalSessionState=true&amp;cachePrepStmts=true&amp;cacheCallableStmts=true&amp;alwaysSendSetIsolation=false&amp;prepStmtCacheSize=4096&amp;cacheServerConfiguration=true&amp;prepStmtCacheSqlLimit=2048&amp;zeroDateTimeBehavior=convertToNull&amp;traceProtocol=false&amp;useUnbufferedInput=false&amp;useReadAheadInput=false&amp;maintainTimeStats=false&amp;useServerPrepStmts&amp;cacheRSMetadata=true&amp;useSSL=false</url>
-    <user>benchmarkdbuser</user>
-    <password>benchmarkdbpass</password>
-    <useUnicode/>
-  </driver>
-</database>
-
-</web-app>

+ 0 - 20
frameworks/Java/spark/src/main/webapp/WEB-INF/web.xml

@@ -1,20 +0,0 @@
-<?xml version="1.0" encoding="UTF-8"?>
-<web-app version="2.5" xmlns="http://java.sun.com/xml/ns/javaee"
-    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
-    xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_2_5.xsd">
- 
-    <filter>
-        <filter-name>SparkFilter</filter-name>
-        <filter-class>spark.servlet.SparkFilter</filter-class>
-        <init-param>
-            <param-name>applicationClass</param-name>
-            <param-value>hello.web.SparkApplication</param-value>
-        </init-param>
-    </filter>
-   
-    <filter-mapping>
-      <filter-name>SparkFilter</filter-name>
-      <url-pattern>/*</url-pattern>
-    </filter-mapping>
-
-</web-app>