This project provides representative performance measures across a wide field of web application frameworks. With much help from the community, coverage is quite broad and we are happy to broaden it further with contributions. The project presently includes frameworks in many languages, including Go, Python, Java, Ruby, PHP, Clojure, Groovy, JavaScript, Erlang, Haskell, Scala, Lua, and C. The current tests exercise plaintext responses, JSON serialization, database reads and writes via the object-relational mapper (ORM), collections, sorting, server-side templates, and XSS counter-measures. Future tests will exercise other components and greater computation.
Read more and see the results of our tests on Amazon EC2 and physical hardware at http://www.techempower.com/benchmarks/
Join in the conversation at our Google Group: https://groups.google.com/forum/?fromgroups=#!forum/framework-benchmarks
Before starting setup, all the required hosts must be provisioned, with the respective operating system and required software installed, and with connectivity for remote management (SSH on Linux, RDP and WinRM on Windows).
Refer to the Benchmark Suite Deployment README file for documentation of the provisioning procedures.
NOTE: If testing a pull request or doing development, it is usually adequate to use only one computer. In that case, your server, client, and database IPs will all be 127.0.0.1.
Install Ubuntu 14.04 with username tfb. Ensure that OpenSSH is selected during installation. If not, run the following command:
$ sudo apt-get install openssh-server
If Ubuntu is already installed, run the following command and follow the prompts.
$ sudo useradd -m -G sudo tfb
Log in as tfb
Fully update. NOTE: If you update the kernel (linux-firmware), it is generally a good idea to reboot afterwards.
$ sudo apt-get update && sudo apt-get upgrade
Run the command: sudo visudo
Change line 20 from %sudo ALL=(ALL:ALL) ALL
to %sudo ALL=(ALL) NOPASSWD: ALL
Run the following (don't enter a password; just hit enter when the prompt pops up). NOTE: This is still necessary even if the client and database are on the same computer as the server.
$ ssh-keygen
$ ssh-copy-id <database ip>
$ ssh-copy-id <client ip>
Install git and clone the Framework Benchmarks repository
$ sudo apt-get install git
$ cd ~
$ git clone https://github.com/TechEmpower/FrameworkBenchmarks.git
$ cd FrameworkBenchmarks
Install the server software. This will take a long time.
$ nohup python toolset/run-tests.py -s <server hostname/ip> -c <client hostname/ip> -u tfb --install server --list-tests &
If you want to watch the progress of the install, run the following. The session can be interrupted easily, so there is no need to worry about keeping a connection open.
$ tail -f nohup.out
Reboot when the install is done
Edit your ~/.bashrc file to change the following:
Set TFB_SERVER_HOST=<ip address> to the server's IP address.
Set TFB_CLIENT_HOST=<ip address> to the client's IP address.
Set TFB_DATABASE_HOST=<ip address> to the database's IP address.
Set TFB_CLIENT_IDENTITY_FILE=<path> to the identity file you specified when you ran ssh-keygen (probably /home/tfb/.ssh/id_rsa if you don't know what it is).
Then run source ~/.bashrc
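After editing, the TFB section of ~/.bashrc might look like the following sketch (the IP addresses here are placeholder values, not real hosts):

```shell
# TechEmpower Framework Benchmarks hosts (example placeholder values)
export TFB_SERVER_HOST=10.0.0.1                        # application server
export TFB_CLIENT_HOST=10.0.0.2                        # load-generation client
export TFB_DATABASE_HOST=10.0.0.3                      # database server
export TFB_CLIENT_IDENTITY_FILE=/home/tfb/.ssh/id_rsa  # key from ssh-keygen
```

Remember that the variables only take effect in new shells, or after you source ~/.bashrc in the current one.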
If you are setting up any other servers, do so before proceeding.
Run the following commands
cd ~/FrameworkBenchmarks
source ~/.bash_profile
# For your first time through the tests, set the ulimit for open files
ulimit -n 8192
# Most software is installed automatically by the script, but running the mongo command below from
# the install script was causing some errors. For now this needs to be run manually.
cd installs && curl -sS https://getcomposer.org/installer | php -- --install-dir=bin
cd ..
sudo apt-get remove --purge openjdk-6-jre openjdk-6-jre-headless
# Change database-private-ip to the database ip
mongo --host database-private-ip < config/create.js
Install Ubuntu 14.04 with username tfb and log in as tfb.
Fully update. NOTE: If you update the kernel (linux-firmware), it is generally a good idea to reboot afterwards.
$ sudo apt-get update && sudo apt-get upgrade
Run the command: sudo visudo
Change line 20 from %sudo ALL=(ALL:ALL) ALL
to %sudo ALL=(ALL) NOPASSWD: ALL
On the app server, run the following from the FrameworkBenchmarks directory (this should only take a small amount of time, several minutes or so):
$ toolset/run-tests.py --install database --list-tests
Install Ubuntu 14.04 with username tfb and log in as tfb.
Fully update. NOTE: If you update the kernel (linux-firmware), it is generally a good idea to reboot afterwards.
$ sudo apt-get update && sudo apt-get upgrade
Run the command: sudo visudo
Change line 20 from %sudo ALL=(ALL:ALL) ALL
to %sudo ALL=(ALL) NOPASSWD: ALL
On the app server, run the following from the FrameworkBenchmarks directory (this should only take a small amount of time, several minutes or so):
$ toolset/run-tests.py --install client --list-tests
You can validate that the setup worked by running a smoke test like this:
toolset/run-tests.py --max-threads 1 --name smoketest --test servlet-raw --type all -m verify
This should run the verification step for a single framework.
Copy installer-bootstrap.ps1 from toolset/setup/windows to the server (use CTRL-C and CTRL-V) and run it with PowerShell. It downloads and runs installer.ps1 from the repository, which installs everything else: python, git, ssh, curl, node, etc., plus PowerShell goodies, and verifies that everything works.
The client/database machine is still assumed to be a Linux box. You can install just the client software via
python toolset\run-tests.py -s server-private-ip -c client-private-ip -i "C:\Users\Administrator\Desktop\client.key" --install-software --install client --list-tests
but this step is not required if you already installed the Linux server and client as described above.
Now you can run tests:
python toolset\run-tests.py -s server-private-ip -c client-private-ip -i "C:\Users\Administrator\Desktop\client.key" --max-threads 2 --duration 30 --sleep 5 --name win --test aspnet --type all
Open a Command Prompt as Administrator and enter this command:
powershell -ExecutionPolicy Bypass -Command "iex (New-Object Net.WebClient).DownloadString('https://raw.github.com/TechEmpower/FrameworkBenchmarks/master/toolset/setup/sqlserver/setup-sqlserver-bootstrap.ps1')"
This will configure SQL Server, the Windows Firewall, and populate the database.
Now, when running run-tests.py
, just add -d <ip of SQL Server instance>
. This works for the (Windows Server-based) aspnet-sqlserver-raw
and aspnet-sqlserver-entityframework
tests.
We ran our tests using three dedicated i7 2600k machines, three EC2 m1.large instances, and three servers from Peak Hosting.
We hope that the community will help us in making these tests better, so if you'd like to make any changes to the tests we currently have, here are some things to keep in mind.
If you're updating a dependency of a framework that uses a dependency management system (Bundler, npm, etc.), please be specific with the version number that you are updating to.
Also, if you do change the dependencies of any test, please update the README file for that test to reflect that change; we want to keep the README files as up to date as possible.
If you would like to update any of the software used, again, please be as specific as possible. While we still install some software via apt-get without specifying a version, we would like to have as much control over the versions as possible.
The main file that installs all the software is toolset/setup/linux/installer.py. It is broken up into two sections: server software and client software.
Additionally, it may be necessary to update the setup.py file in the framework's directory to use this new version.
If you update any software, please update the README files of any tests that use that software.
When adding a new framework or new test to an existing framework, please follow these steps:
For descriptions of the test types that we run against each framework, see the test requirements section of the Results web site.
The benchmark_config file is used by our scripts to both identify the available tests and to extract metadata describing each test.
This file should exist at the root of the test directory.
Here is the basic structure of benchmark_config, using the Compojure framework as an example. Compojure has two test permutations, which are identified as the "tests" list in the JSON structure below.
{
"framework": "compojure",
"tests": [{
"default": {
"setup_file": "setup",
"json_url": "/compojure/json",
"db_url": "/compojure/db/1",
"query_url": "/compojure/db/",
"fortune_url": "/compojure/fortune-hiccup",
"plaintext_url": "/compojure/plaintext",
"port": 8080,
"approach": "Realistic",
"classification": "Micro",
"database": "MySQL",
"framework": "compojure",
"language": "Clojure",
"orm": "Micro",
"platform": "Servlet",
"webserver": "Resin",
"os": "Linux",
"database_os": "Linux",
"display_name": "compojure",
"notes": "",
"versus": "servlet"
},
"raw": {
"setup_file": "setup",
"db_url": "/compojure/dbraw/1",
"query_url": "/compojure/dbraw/",
"port": 8080,
"approach": "Realistic",
"classification": "Micro",
"database": "MySQL",
"framework": "compojure",
"language": "Clojure",
"orm": "Raw",
"platform": "Servlet",
"webserver": "Resin",
"os": "Linux",
"database_os": "Linux",
"display_name": "compojure-raw",
"notes": "",
"versus": "servlet"
}
}]
}
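Since benchmark_config is plain JSON, a script can discover the available tests by parsing it. The sketch below is not the actual toolset code, just an illustration of enumerating the permutations in a config shaped like the Compojure example above:

```python
import json

# A trimmed-down benchmark_config in the same shape as the Compojure
# example above (URLs and names copied from it).
raw = """
{
  "framework": "compojure",
  "tests": [{
    "default": {"setup_file": "setup", "json_url": "/compojure/json"},
    "raw": {"setup_file": "setup", "db_url": "/compojure/dbraw/1"}
  }]
}
"""
config = json.loads(raw)

# Each entry in "tests" maps a permutation name to its metadata. Judging
# by the display_name values in the example above, the "default"
# permutation keeps the bare framework name and others get a suffix.
for permutations in config["tests"]:
    for name, meta in permutations.items():
        test_name = config["framework"] if name == "default" else config["framework"] + "-" + name
        print(test_name, "->", meta.get("json_url") or meta.get("db_url"))
```

Running this prints one line per permutation (compojure and compojure-raw) together with its primary test URL.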
json_url: the relative URL of the JSON test, e.g. /json
db_url: the relative URL of the database test, e.g. /db
fortune_url: the relative URL of the fortunes test, e.g. /fortune
plaintext_url: the relative URL of the plaintext test, e.g. /plaintext
approach: Realistic or Stripped (see the results web site for a description of all metadata attributes)
classification: Full, Micro, or Platform
database: MySQL, Postgres, MongoDB, SQLServer, or None
orm: Full, Micro, or Raw
os: Linux or Windows
database_os: Linux or Windows
If your framework and platform can execute on both Windows and Linux, we encourage you to specify tests for both operating systems. This increases the amount of testing you should do before submitting your pull-request, however, so we understand if you start with just one of the two.
The steps involved are:
Add a new test permutation to the benchmark_config file for the Windows test (or vice-versa). When the benchmark script runs on Linux, it skips tests where the Application Operating System (os in the file) is specified as Windows. When running on Windows, it skips tests where the os field is Linux.

The install.sh
file for each framework starts the bash process which will
install that framework. Typically, the first thing done is to call fw_depends
to run installations for any necessary software that TFB has already
created installation scripts for. TFB provides a reasonably wide range of
core software, so your install.sh
may only need to call fw_depends
and
exit. Note: fw_depends
does not guarantee dependency installation, so
list software in the proper order e.g. if foo
depends on bar
use fw_depends bar foo
.
Here are some example install.sh
files
#!/bin/bash
# My server only needs nodejs
fw_depends nodejs
#!/bin/bash
# My server is weird and needs nodejs and mono and go
fw_depends nodejs mono go
#!/bin/bash
# My server needs nodejs...
fw_depends nodejs mono go
# ...and some other software that there is no installer script for.
# Note: Use IROOT variable to put software in the right folder.
# You can also use FWROOT to refer to the project root, or
# TROOT to refer to the root of your framework
# Please see guidelines on writing installation scripts
# (URL below is a placeholder for wherever your software lives)
wget http://example.com/mystuff.tar.gz -O mystuff.tar.gz
tar xzf mystuff.tar.gz
cd mystuff
./configure --prefix=$IROOT && make && make install
To see what TFB provides installations for, look in toolset/setup/linux
in the folders frameworks
, languages
, systools
, and webservers
.
You should pass the filename, without the ".sh" extension, to fw_depends.
Here is a listing as of July 2014:
$ ls frameworks
grails.sh nawak.sh play1.sh siena.sh vertx.sh yesod.sh
jester.sh onion.sh play2.sh treefrog.sh wt.sh
$ ls languages
composer.sh erlang.sh hhvm.sh mono.sh perl.sh pypy.sh racket.sh urweb.sh
dart.sh go.sh java.sh nimrod.sh phalcon.sh python2.sh ringojs.sh xsp.sh
elixir.sh haskell.sh jruby.sh nodejs.sh php.sh python3.sh ruby.sh yaf.sh
$ ls systools
leiningen.sh maven.sh
$ ls webservers
lapis.sh mongrel2.sh nginx.sh openresty.sh resin.sh weber.sh zeromq.sh
The bash_profile.sh
file is sourced before installing software or before
running the framework test. This is mostly used when running your
framework, to perform actions such as updating PATH
or defining environment
variables your framework requires e.g. GOROOT
. You can use these
variables:
Example of bash_profile.sh
:
# Set the root of our go installation
export GOROOT=${IROOT}/go
# Where to find the go executable
export PATH="$GOROOT/bin:$PATH"
export GOPATH=${FWROOT}/go
Do not generate any output, such as by using echo, inside of bash_profile.sh.
The setup file is responsible for starting and stopping the test.
The setup file is a python script that contains a start() and a stop() function. The start function should build the source, make any necessary changes to the framework's configuration, and then start the server. The stop function should shutdown the server, including all sub-processes as applicable.
By convention, the configuration files used by a framework should specify the database server as localhost
so that developing tests in a single-machine environment can be done in an ad hoc fashion, without using the benchmark scripts.
When running a benchmark script, the script needs to modify each framework's configuration so that the framework connects to a database host provided as a command line argument. In order to do this, use setup_util.replace_text() to make necessary modifications prior to starting the server.
For example:
setup_util.replace_text("wicket/src/main/webapp/WEB-INF/resin-web.xml", "mysql:\/\/.*:3306", "mysql://" + args.database_host + ":3306")
Using localhost
in the raw configuration file is not a requirement as long as the replace_text
call properly injects the database host provided to the benchmarker toolset as a command line argument.
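For reference, setup_util.replace_text is conceptually just a regex substitution over a file; the function body below is a hand-written sketch of that idea, not the toolset's actual implementation:

```python
import re

def replace_text(path, pattern, replacement):
    """Replace every regex match of `pattern` in the file at `path` in place."""
    with open(path, "r") as f:
        contents = f.read()
    with open(path, "w") as f:
        f.write(re.sub(pattern, replacement, contents))
```

With a helper like this, the resin-web.xml call above rewrites whatever host appears in the JDBC URL with the database host the benchmarker passes on the command line.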
Here is an example of Wicket's setup file.
import subprocess
import sys
import setup_util
##################################################
# start(args, logfile, errfile)
#
# Starts the server for Wicket
# returns 0 if everything completes, 1 otherwise
##################################################
def start(args, logfile, errfile):
# setting the database url
setup_util.replace_text("wicket/src/main/webapp/WEB-INF/resin-web.xml", "mysql:\/\/.*:3306", "mysql://" + args.database_host + ":3306")
# 1. Compile and package
# 2. Clean out possible old tests
# 3. Copy package to Resin's webapp directory
# 4. Start resin
try:
subprocess.check_call("mvn clean compile war:war", shell=True, cwd="wicket", stderr=errfile, stdout=logfile)
subprocess.check_call("rm -rf $RESIN_HOME/webapps/*", shell=True, stderr=errfile, stdout=logfile)
subprocess.check_call("cp wicket/target/hellowicket-1.0-SNAPSHOT.war $RESIN_HOME/webapps/wicket.war", shell=True, stderr=errfile, stdout=logfile)
subprocess.check_call("$RESIN_HOME/bin/resinctl start", shell=True, stderr=errfile, stdout=logfile)
return 0
except subprocess.CalledProcessError:
return 1
##################################################
# stop(logfile, errfile)
#
# Stops the server for Wicket
# returns 0 if everything completes, 1 otherwise
##################################################
def stop(logfile, errfile):
try:
subprocess.check_call("$RESIN_HOME/bin/resinctl shutdown", shell=True, stderr=errfile, stdout=logfile)
return 0
except subprocess.CalledProcessError:
return 1
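Stripped to its skeleton, a setup file for a hypothetical new framework only needs the two entry points described above; the shell commands here are placeholders standing in for your real build, start, and shutdown commands:

```python
import subprocess

def start(args, logfile, errfile):
    # Point the framework's config at args.database_host (via
    # setup_util.replace_text), then build and launch the server.
    try:
        subprocess.check_call("echo build-and-start", shell=True,
                              stderr=errfile, stdout=logfile)
        return 0
    except subprocess.CalledProcessError:
        return 1

def stop(logfile, errfile):
    # Shut the server down, including any sub-processes.
    try:
        subprocess.check_call("echo shutdown", shell=True,
                              stderr=errfile, stdout=logfile)
        return 0
    except subprocess.CalledProcessError:
        return 1
```

Both functions follow the convention in the Wicket example: return 0 on success and 1 if any command fails.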
A contributor named @kpacha has built a pure JavaScript tool for generating the setup.py
file for a new framework via an in-browser form. Check out his FrameworkBenchmarks Setup Builder.